<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>News | Urban Analytics Lab | Singapore</title><link>https://ual.sg/post/</link><atom:link href="https://ual.sg/post/index.xml" rel="self" type="application/rss+xml"/><description>News</description><generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Sat, 07 Mar 2026 17:59:21 +0800</lastBuildDate><image><url>https://ual.sg/media/logo_hu_fbd8a7322766df57.png</url><title>News</title><link>https://ual.sg/post/</link></image><item><title>It is not always greener on the other side</title><link>https://ual.sg/post/2026/03/07/it-is-not-always-greener-on-the-other-side/</link><pubDate>Sat, 07 Mar 2026 17:59:21 +0800</pubDate><guid>https://ual.sg/post/2026/03/07/it-is-not-always-greener-on-the-other-side/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Quintana M, Liu F, Torkko J, Gu Y, Liang X, Hou Y, Ito K, Zhu Y, Abdelrahman M, Toivonen T, Lu Y, Biljecki F (2026): It is not always greener on the other side: Greenery perception across demographics and personalities in multiple cities. Landscape and Urban Planning, 271: 105618. &lt;a href="https://doi.org/10.1016/j.landurbplan.2026.105618" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.landurbplan.2026.105618&lt;/a&gt; &lt;a href="https://ual.sg/publication/2026-land-greenery/2026-land-greenery.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/matias-quintana/"&gt;Matias Quintana&lt;/a&gt;.
It extends his project &lt;a href="https://github.com/matqr/specs" target="_blank" rel="noopener"&gt;SPECS&lt;/a&gt;, which &lt;a href="https://ual.sg/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/"&gt;we featured earlier&lt;/a&gt;.
Congratulations to Matias on his new publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The GitHub repository with the open-source code we developed can be found &lt;a href="https://github.com/matqr/greenery-perception" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2026/03/07/it-is-not-always-greener-on-the-other-side/1_hu_818c1510019afb60.webp 400w,
/post/2026/03/07/it-is-not-always-greener-on-the-other-side/1_hu_f1aff9b31c1b2396.webp 760w,
/post/2026/03/07/it-is-not-always-greener-on-the-other-side/1_hu_2ecb62238a954176.webp 1200w"
src="https://ual.sg/post/2026/03/07/it-is-not-always-greener-on-the-other-side/1_hu_818c1510019afb60.webp"
width="757"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Perceived and measured greenery (GVI) correlate positively and moderately everywhere.&lt;/li&gt;
&lt;li&gt;Perceived greenery regularly overestimates measured greenery (GVI) across cities.&lt;/li&gt;
&lt;li&gt;Almost no demographic or personality trait influences greenery perception.&lt;/li&gt;
&lt;li&gt;People’s place of residence is the only factor that significantly influences perception.&lt;/li&gt;
&lt;li&gt;Greenery placement trumps proximity in perceived greenery.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2026/03/07/it-is-not-always-greener-on-the-other-side/2_hu_883a13d15cad90b8.webp 400w,
/post/2026/03/07/it-is-not-always-greener-on-the-other-side/2_hu_3a144a529c9f56a9.webp 760w,
/post/2026/03/07/it-is-not-always-greener-on-the-other-side/2_hu_d16a3973d70d8af5.webp 1200w"
src="https://ual.sg/post/2026/03/07/it-is-not-always-greener-on-the-other-side/2_hu_883a13d15cad90b8.webp"
width="760"
height="742"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Quantifying and assessing urban greenery is consequential for planning and development, reflecting the everlasting importance of green spaces for multiple climate and well-being dimensions of cities. Evaluation can be broadly grouped into objective (e.g., measuring the amount of greenery) and subjective (e.g., polling the perception of people) approaches, which may differ – what people see and feel about how green a place is might not match the measurements of the actual amount of vegetation. In this work, we advance the state of the art by measuring such differences and explaining them through human, geographic, and spatial dimensions. The experiments rely on contextual information extracted from street view imagery and a comprehensive urban visual perception survey collected from 1000 people across five countries with their extensive demographic and personality information. We analyze the discrepancies between objective measures (e.g., Green View Index (GVI)) and subjective scores (e.g., pairwise ratings), examining whether they can be explained by a variety of human and visual factors such as age group and spatial variation of greenery in the scene. The findings reveal that such discrepancies are comparable around the world and that demographics and personality do not play a significant role in perception. Further, while perceived and measured greenery correlate consistently across geographies (both where people and where imagery are from), where people live plays a significant role in explaining perceptual differences, with these two being the top among the seven features that influence perceived greenery the most. This location influence suggests that cultural, environmental, and experiential factors substantially shape how individuals observe greenery in cities. We also found that the spatial arrangement of greenery in view, rather than its proximity to the person, influences perception.
Our study provides a new understanding of the deep relationships between objective and subjective street-level greenery assessments, contributing to a more human-centric design of green urban environments.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2026-land-greenery/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1mjg6cUG5e8ed" target="_blank" rel="noopener"&gt;freely available&lt;/a&gt; until 25 April 2026.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2026-land-greenery/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2026/03/07/it-is-not-always-greener-on-the-other-side/page-one_hu_b99b6b6e4cc9fbbe.webp 400w,
/post/2026/03/07/it-is-not-always-greener-on-the-other-side/page-one_hu_5dd42599136cb54a.webp 760w,
/post/2026/03/07/it-is-not-always-greener-on-the-other-side/page-one_hu_758ad2c4d0316363.webp 1200w"
src="https://ual.sg/post/2026/03/07/it-is-not-always-greener-on-the-other-side/page-one_hu_b99b6b6e4cc9fbbe.webp"
width="562"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2026_land_greenery&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Quintana, Matias and Liu, Fangqi and Torkko, Jussi and Gu, Youlong and Liang, Xiucheng and Hou, Yujun and Ito, Koichi and Zhu, Yihan and Abdelrahman, Mahmoud and Toivonen, Tuuli and Lu, Yi and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.landurbplan.2026.105618}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Landscape and Urban Planning}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{105618}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{271}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{It is not always greener on the other side: Greenery perception across demographics and personalities in multiple cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2026}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Call for Abstracts - Workshop 'State of the Art and Outlook of Street-Level Imagery for Urban Science'</title><link>https://ual.sg/post/2026/03/01/call-for-abstracts-workshop-state-of-the-art-and-outlook-of-street-level-imagery-for-urban-science/</link><pubDate>Sun, 01 Mar 2026 19:30:01 +0800</pubDate><guid>https://ual.sg/post/2026/03/01/call-for-abstracts-workshop-state-of-the-art-and-outlook-of-street-level-imagery-for-urban-science/</guid><description>&lt;p&gt;We are happy to announce a workshop that will be part of the &lt;a href="https://geoaiconference.org/" target="_blank" rel="noopener"&gt;1st International Conference on Geospatial Artificial Intelligence (GeoAI 2026)&lt;/a&gt;, which will take place in Ghent, Belgium on 3-6 June 2026.
We welcome abstracts on the topic.&lt;/p&gt;
&lt;p&gt;Organisers: &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, &lt;a href="https://ual.sg/author/xiaobing-wei/"&gt;Xiaobing Wei&lt;/a&gt;, &lt;a href="https://ual.sg/author/pengyuan-liu/"&gt;Pengyuan Liu&lt;/a&gt;, Weiming Huang, Fangli Guan, and Stephen Law.&lt;/p&gt;
&lt;p&gt;Deadline: 30 March 2026.&lt;/p&gt;
&lt;p&gt;More information: see &lt;a href="https://geoaiconference.org/?page_id=200" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;GeoAI 2026 is organised by &lt;a href="https://geoaiugent.wordpress.com/" target="_blank" rel="noopener"&gt;The GeoAI Research Center at Ghent University&lt;/a&gt; (chairs: Haosheng Huang and Nico Van de Weghe).&lt;/p&gt;</description></item><item><title>Seminar Rethinking cities in the age of AI</title><link>https://ual.sg/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/</link><pubDate>Fri, 06 Feb 2026 19:30:01 +0800</pubDate><guid>https://ual.sg/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/</guid><description>&lt;p&gt;As part of &lt;a href="https://ual.sg/seminars/"&gt;our series of seminars&lt;/a&gt;, we hosted a joint seminar with guests from Australia, Korea, and China:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;The Street-Smart City: Using AI to Decode Urban Life from the Ground Up, by &lt;a href="https://www.arch.tsinghua.edu.cn/info/FUrban%20Planning%20and%20Design/2133" target="_blank" rel="noopener"&gt;Assoc Prof Yuan Lai&lt;/a&gt;, Tsinghua University&lt;/li&gt;
&lt;li&gt;AI-aided Design for Participatory Urban Planning and Design, by &lt;a href="https://gses.snu.ac.kr/people/faculty/7" target="_blank" rel="noopener"&gt;Assoc Prof Steven Jige Quan&lt;/a&gt;, Seoul National University&lt;/li&gt;
&lt;li&gt;Quantum Computing and the City, by &lt;a href="https://www.qut.edu.au/about/our-people/academic-profiles/tan.yigitcanlar" target="_blank" rel="noopener"&gt;Prof Tan Yigitcanlar&lt;/a&gt;, Queensland University of Technology&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/2026-01-28_hu_cc7fb1a54760c5bb.webp 400w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/2026-01-28_hu_b150ebfa140c7318.webp 760w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/2026-01-28_hu_16c95840f91cca2f.webp 1200w"
src="https://ual.sg/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/2026-01-28_hu_cc7fb1a54760c5bb.webp"
width="538"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The event attracted a full lecture room!
Many thanks to our distinguished guest lecturers and everyone for attending.
Our guests delivered a series of compelling talks, followed by an engaging panel session.&lt;/p&gt;
&lt;p&gt;Check out also the special issue &lt;a href="https://www.sciencedirect.com/special-issue/328646/quantum-computing-and-the-city" target="_blank" rel="noopener"&gt;Quantum Computing and the City&lt;/a&gt; in &lt;em&gt;Cities&lt;/em&gt; that our guests are editing.&lt;/p&gt;
&lt;p&gt;This session was hosted by &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; and &lt;a href="https://shengxiaoli.wordpress.com/" target="_blank" rel="noopener"&gt;Shengxiao (Alex) Li&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Stay tuned for the upcoming events.
Learn more about our seminars &lt;a href="https://ual.sg/seminars/"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;In the meantime, check out some photos below.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/1_hu_1e6dd48e89564793.webp 400w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/1_hu_4a5c2fcd30763ceb.webp 760w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/1_hu_24180003c1170cfe.webp 1200w"
src="https://ual.sg/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/1_hu_1e6dd48e89564793.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/2_hu_898c6087fd744b36.webp 400w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/2_hu_96cfdb0f2eea4f5e.webp 760w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/2_hu_5927e184409d8846.webp 1200w"
src="https://ual.sg/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/2_hu_898c6087fd744b36.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/3_hu_b58bceccb47238bb.webp 400w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/3_hu_4c27518c18a7b40e.webp 760w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/3_hu_2a377b4b42b07801.webp 1200w"
src="https://ual.sg/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/3_hu_b58bceccb47238bb.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/4_hu_65db40235a065448.webp 400w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/4_hu_49b9131948d5db11.webp 760w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/4_hu_ecc9d7b9ed527950.webp 1200w"
src="https://ual.sg/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/4_hu_65db40235a065448.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/5_hu_6b9dc8be5c8222f.webp 400w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/5_hu_7343b88ddb5ffbe0.webp 760w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/5_hu_bbe0c1e16bb8fb82.webp 1200w"
src="https://ual.sg/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/5_hu_6b9dc8be5c8222f.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/6_hu_9f57324b34e4d89a.webp 400w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/6_hu_419a4e51b2d9500c.webp 760w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/6_hu_12a039cddcaa2b0a.webp 1200w"
src="https://ual.sg/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/6_hu_9f57324b34e4d89a.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/7_hu_fe5f4eb51c009b8b.webp 400w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/7_hu_8a415f3d50ab401b.webp 760w,
/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/7_hu_bab88e34e027e231.webp 1200w"
src="https://ual.sg/post/2026/02/06/seminar-rethinking-cities-in-the-age-of-ai/7_hu_fe5f4eb51c009b8b.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Congratulations to Winston Yap on his PhD!</title><link>https://ual.sg/post/2025/12/01/congratulations-to-winston-yap-on-his-phd/</link><pubDate>Mon, 01 Dec 2025 09:30:01 +0800</pubDate><guid>https://ual.sg/post/2025/12/01/congratulations-to-winston-yap-on-his-phd/</guid><description>&lt;p&gt;On 11 November 2025, &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt; defended his PhD thesis &lt;em&gt;Urban Graph Analytics: Connecting Cities, Data, and People&lt;/em&gt;.
Congratulations!&lt;/p&gt;
&lt;p&gt;Winston started his doctoral studies in 2021, following his graduation with a Master of Urban Planning from NUS and research experience in academia.&lt;/p&gt;
&lt;p&gt;During his PhD, he advanced the application of urban graphs in urban analytics and demonstrated how they can serve as a powerful tool in urban planning.
Winston&amp;rsquo;s work was published in leading journals, such as &lt;em&gt;Nature Sustainability&lt;/em&gt; and &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt;.
He was invited to give talks at other institutions and conducted research visits at MIT and at a startup in Japan.
Further, he was awarded the Singapore Data Science Consortium Dissertation Fellowship and was recognised as a World Cities Summit (WCS) Young Leader, participating in associated activities.&lt;/p&gt;
&lt;p&gt;The committee members were Ye Zhang, Adrian Chong, Rudi Stouffs, and Filip Biljecki (thesis advisor).&lt;/p&gt;
&lt;p&gt;We wish him all the best and lots of continued successes, and we thank him for the collaboration in the past years.
Winston has substantially contributed to our research group and has been instrumental in shaping its research agenda.
He will continue his career as a Postdoctoral Associate at Cornell University, having been awarded the prestigious Ezra Systems Scholars fellowship.&lt;/p&gt;
&lt;p&gt;To learn more about Winston&amp;rsquo;s work, visit &lt;a href="https://www.winstonyym.com" target="_blank" rel="noopener"&gt;his website&lt;/a&gt; and &lt;a href="https://scholar.google.com/citations?user=p14e60QAAAAJ&amp;amp;hl=en" target="_blank" rel="noopener"&gt;his Google Scholar profile&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Short abstract of his thesis:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Cities are under growing pressure to become both sustainable and equitable as they confront rapid urbanization and the slow-burn realities of climate change. Meeting these challenges increasingly depends on data-driven insights enabled by the explosion of large-scale geospatial data and advances in computational methods that reveal how cities grow, function, and evolve. Yet most computational efforts still develop in disciplinary silos, leaving them ill-equipped to address complex, interlocking issues such as housing, transportation, and climate resilience. To bridge this gap, this research introduces urban graph analytics, an extension of traditional street networks, that enables more integrated and interpretable, systems-based urban computational planning. Urban graphs offer a standardized, readily extensible framework for capturing the multiscale connections among people, infrastructure, and the urban environment. This research makes three primary contributions: 1) Conceptual development of urban graphs and their relevance for computational planning and design; 2) Implementation of open-source tools and datasets to make urban graph generation and analytics accessible and adaptable for any city worldwide; 3) Application of city graphs to real-world problems, such as advancing fair climate action across multiple cities and quantifying urban health outcomes. At its core, I hope to demonstrate how urban graphs can serve as a powerful tool for integrated, data-informed urban planning, helping cities achieve sustainability goals and improve quality of life for their residents.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/12/01/congratulations-to-winston-yap-on-his-phd/2025_Winston_2_hu_5770c9fec07b1c27.webp 400w,
/post/2025/12/01/congratulations-to-winston-yap-on-his-phd/2025_Winston_2_hu_1b28f736e740287f.webp 760w,
/post/2025/12/01/congratulations-to-winston-yap-on-his-phd/2025_Winston_2_hu_55e4382fc5be2eaa.webp 1200w"
src="https://ual.sg/post/2025/12/01/congratulations-to-winston-yap-on-his-phd/2025_Winston_2_hu_5770c9fec07b1c27.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Want to join us in your PhD journey? Read &lt;a href="https://ual.sg/opportunities/application-guide/"&gt;this guide&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Our nighttime SVI research featured in the media</title><link>https://ual.sg/post/2025/11/23/our-nighttime-svi-research-featured-in-the-media/</link><pubDate>Sun, 23 Nov 2025 14:28:32 +0800</pubDate><guid>https://ual.sg/post/2025/11/23/our-nighttime-svi-research-featured-in-the-media/</guid><description>&lt;p&gt;Research by &lt;a href="https://ual.sg/author/zicheng-fan/"&gt;Zicheng Fan&lt;/a&gt; and &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; on nighttime street view imagery got featured on the front page of &lt;a href="https://www.zaobao.com.sg" target="_blank" rel="noopener"&gt;Lianhe Zaobao&lt;/a&gt;, the largest Singaporean Chinese-language newspaper!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/23/our-nighttime-svi-research-featured-in-the-media/1_hu_455240058f26beb0.webp 400w,
/post/2025/11/23/our-nighttime-svi-research-featured-in-the-media/1_hu_6bd5673ca55de9ca.webp 760w,
/post/2025/11/23/our-nighttime-svi-research-featured-in-the-media/1_hu_fb4da75a21f6eaca.webp 1200w"
src="https://ual.sg/post/2025/11/23/our-nighttime-svi-research-featured-in-the-media/1_hu_455240058f26beb0.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Using a new approach, the study has mapped Singapore&amp;rsquo;s night-time brightness, revealing contrasts among the city-state&amp;rsquo;s regions. The study found that eastern residential areas, which are denser and more active, are brighter than the western zones, which are home to more industrial estates and nature reserves. We appreciate that our work has been put in the spotlight.&lt;/p&gt;
&lt;p&gt;Link to the online article (in Chinese) is &lt;a href="https://www.zaobao.com.sg/news/singapore/story20251109-7716374" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;国大模型绘制新加坡“光地图” 东部夜间最亮西部较暗
(English: The National University of Singapore (NUS) model depicts a &amp;ldquo;light map&amp;rdquo; of Singapore, showing the east being the brightest at night and the west being darker).&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-scs-night-svi/"&gt;paper&lt;/a&gt; published in &lt;em&gt;Sustainable Cities and Society&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/23/our-nighttime-svi-research-featured-in-the-media/2_hu_b2151c6c8f33672e.webp 400w,
/post/2025/11/23/our-nighttime-svi-research-featured-in-the-media/2_hu_364a1da7d15cbe22.webp 760w,
/post/2025/11/23/our-nighttime-svi-research-featured-in-the-media/2_hu_af5f2cc4e9e109e4.webp 1200w"
src="https://ual.sg/post/2025/11/23/our-nighttime-svi-research-featured-in-the-media/2_hu_b2151c6c8f33672e.webp"
width="760"
height="458"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>VoxCity: a one-stop Python package for open geospatial data integration, 3D city model generation, and urban environment simulation</title><link>https://ual.sg/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/</link><pubDate>Mon, 17 Nov 2025 11:20:02 +0800</pubDate><guid>https://ual.sg/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/</guid><description>&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/1_hu_e45ec80cb7d2595d.webp 400w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/1_hu_dfacf25688aaff77.webp 760w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/1_hu_e7877946e9037530.webp 1200w"
src="https://ual.sg/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/1_hu_e45ec80cb7d2595d.webp"
width="760"
height="332"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;We are glad to introduce our latest open project: VoxCity, which was also published as a paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Fujiwara K, Tsurumi R, Kiyono T, Fan Z, Liang X, Lei B, Yap W, Ito K, Biljecki F (2026): VoxCity: A seamless framework for open geospatial data integration, grid-based semantic 3D city model generation, and urban environment simulation. Computers, Environment and Urban Systems 123: 102366. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2025.102366" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2025.102366&lt;/a&gt; &lt;a href="https://ual.sg/publication/2026-ceus-voxcity/2026-ceus-voxcity.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/kunihiko-fujiwara/"&gt;Kunihiko Fujiwara&lt;/a&gt;.
Congratulations on the publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;VoxCity is a Python package that provides a seamless solution for grid-based 3D city model generation and urban simulation for cities worldwide. VoxCity&amp;rsquo;s generator module automatically downloads building heights, tree canopy heights, land cover, and terrain elevation within a specified target area, and voxelizes buildings, trees, land cover, and terrain to generate an integrated voxel city model. The simulator module enables users to conduct environmental simulations, including solar radiation and view index analyses. Users can export the generated models using several file formats compatible with external software, such as ENVI-met (INX), Blender, and Rhino (OBJ).&lt;/p&gt;
&lt;p&gt;The GitHub repository is available &lt;a href="https://github.com/kunifujiwara/VoxCity" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
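&lt;p&gt;As a rough sketch of the workflow described above – the function and parameter names below are illustrative assumptions, not the exact API; please refer to the repository&amp;rsquo;s README for authoritative usage – generating a model and running a simulation might look like:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-python" data-lang="python"&gt;# Illustrative pseudocode only; all names are assumptions, see the VoxCity README.

# 1) Define the target area and voxel size; the generator downloads building
#    heights, tree canopy heights, land cover, and terrain elevation for the
#    area, then voxelizes them into an integrated voxel city model.
model = generate_city(rectangle=(103.82, 1.28, 103.86, 1.31), mesh_size=5)

# 2) Run a built-in environmental simulation, e.g. a view index analysis.
gvi = compute_view_index(model, target="greenery")

# 3) Export in a format compatible with external software:
#    OBJ for Blender/Rhino, INX for ENVI-met.
export_model(model, path="city.obj")
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;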
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/2_hu_157bfcbcbf84e17e.webp 400w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/2_hu_1b1760fabe65fc93.webp 760w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/2_hu_dd24dafb611f7439.webp 1200w"
src="https://ual.sg/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/2_hu_157bfcbcbf84e17e.webp"
width="760"
height="259"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;A Python package for 3D city model generation and environment simulations worldwide.&lt;/li&gt;
&lt;li&gt;Data integration of building and canopy height, land cover, and terrain elevation.&lt;/li&gt;
&lt;li&gt;Voxelization of buildings, trees, land cover, and terrain for an integrated model.&lt;/li&gt;
&lt;li&gt;Built-in simulation functions for solar irradiance, view index, and landmark visibility.&lt;/li&gt;
&lt;li&gt;Export of files compatible with external software, e.g., ENVI-met, Blender and Rhino.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/3_hu_37e05012949caf53.webp 400w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/3_hu_6a8f81d1d6bbc54f.webp 760w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/3_hu_21d903fcaa1c6a56.webp 1200w"
src="https://ual.sg/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/3_hu_37e05012949caf53.webp"
width="545"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/4_hu_6918541252516bbf.webp 400w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/4_hu_efb3f96398177b73.webp 760w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/4_hu_1521430851173d24.webp 1200w"
src="https://ual.sg/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/4_hu_6918541252516bbf.webp"
width="760"
height="723"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/5_hu_1b968b3c00f2effa.webp 400w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/5_hu_edc3c8c6cbbdd96c.webp 760w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/5_hu_f45a1b70c36ec853.webp 1200w"
src="https://ual.sg/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/5_hu_1b968b3c00f2effa.webp"
width="760"
height="558"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;Three-dimensional urban environment simulation is a powerful tool for informed urban planning. However, the intensive manual effort required to prepare input 3D city models has hindered its widespread adoption. To address this challenge, we present VoxCity, an open-source Python package that provides a one-stop solution for grid-based 3D city model generation and urban environment simulation for cities worldwide. VoxCity’s ‘generator’ subpackage automatically downloads building heights, tree canopy heights, land cover, and terrain elevation within a specified target area, and voxelizes buildings, trees, land cover, and terrain to generate an integrated voxel city model. The ‘simulator’ subpackage enables users to conduct environmental simulations, including solar radiation and view index analyses. Users can export the generated models using several file formats compatible with external software, such as ENVI-met (INX), Blender, and Rhino (OBJ). We generated 3D city models for eight global cities, and demonstrated the calculation of solar irradiance, sky view index, and green view index. We also showcased microclimate simulation and 3D rendering visualization through ENVI-met and Rhino, respectively, through the file export function. Additionally, we reviewed openly available geospatial data to create guidelines to help users choose appropriate data sources depending on their target areas and purposes. VoxCity can significantly reduce the effort and time required for 3D city model preparation and promote the utilization of urban environment simulations. This contributes to more informed urban and architectural design that considers environmental impacts, and in turn, fosters sustainable and livable cities. VoxCity is released openly at &lt;a href="https://github.com/kunifujiwara/VoxCity" target="_blank" rel="noopener"&gt;https://github.com/kunifujiwara/VoxCity&lt;/a&gt;.&lt;/p&gt;
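As a rough illustration of the view-index analyses mentioned in the abstract, the sketch below estimates a sky view index by sampling rays over the upper hemisphere from a point in a voxel grid. The sampling scheme, function signature, and parameters are our own simplifications, not VoxCity's implementation.

```python
# Toy sky view index by hemispherical ray sampling (simplified stand-in
# for the kind of computation a view-index simulator performs).
import math

def sky_view_index(grid, origin, n_azimuth=36, n_elevation=9,
                   max_dist=50.0, step=0.5):
    """Fraction of sampled upward rays that escape the grid unblocked.
    grid[r][c][z] is 0 for empty space; origin is (r, c, z)."""
    rows, cols, nz = len(grid), len(grid[0]), len(grid[0][0])
    r0, c0, z0 = origin
    open_rays = 0
    total = n_azimuth * n_elevation
    for i in range(n_azimuth):
        az = 2 * math.pi * i / n_azimuth
        for j in range(n_elevation):
            el = (math.pi / 2) * (j + 0.5) / n_elevation  # above horizon
            blocked = False
            d = step
            while d <= max_dist:  # march along the ray in fixed steps
                r = int(r0 + d * math.cos(el) * math.cos(az))
                c = int(c0 + d * math.cos(el) * math.sin(az))
                z = int(z0 + d * math.sin(el))
                if z >= nz or not (0 <= r < rows and 0 <= c < cols):
                    break  # left the grid: counts as open sky
                if grid[r][c][z] != 0:
                    blocked = True  # hit terrain, building, or tree
                    break
                d += step
            if not blocked:
                open_rays += 1
    return open_rays / total

# Fully empty grid: every sampled ray reaches the sky.
empty = [[[0] * 4 for _ in range(5)] for _ in range(5)]
sky_view_index(empty, (2, 2, 0))  # 1.0
```

A green view index would follow the same pattern, counting rays whose first hit is a tree voxel instead of counting escapes.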
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2026-ceus-voxcity/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2026-ceus-voxcity/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/page-one_hu_ba0ac4fe78f19479.webp 400w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/page-one_hu_62e5f663261dc130.webp 760w,
/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/page-one_hu_27d73972030de08f.webp 1200w"
src="https://ual.sg/post/2025/11/17/voxcity-a-one-stop-python-package-for-open-geospatial-data-integration-3d-city-model-generation-and-urban-environment-simulation/page-one_hu_ba0ac4fe78f19479.webp"
width="583"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2026_ceus_voxcity&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Fujiwara, Kunihiko and Tsurumi, Ryuta and Kiyono, Tomoki and Fan, Zicheng and Liang, Xiucheng and Lei, Binyu and Yap, Winston and Ito, Koichi and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2025.102366}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{102366}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{VoxCity: A seamless framework for open geospatial data integration, grid-based semantic 3D city model generation, and urban environment simulation}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{123}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2026}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>OpenFACADES: An open framework for architectural caption and attribute data enrichment via street view imagery</title><link>https://ual.sg/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/</link><pubDate>Sun, 02 Nov 2025 07:59:02 +0800</pubDate><guid>https://ual.sg/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/</guid><description>&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/logo-full_hu_c6ea42f91eda33c2.webp 400w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/logo-full_hu_8e90e78db6b20557.webp 760w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/logo-full_hu_dba8722f29e44aca.webp 1200w"
src="https://ual.sg/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/logo-full_hu_c6ea42f91eda33c2.webp"
width="760"
height="305"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;We are glad to introduce our latest open project: OpenFACADES, which was also published as a paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Liang X, Xie J, Zhao T, Stouffs R, Biljecki F (2025): OpenFACADES: An open framework for architectural caption and attribute data enrichment via street view imagery. ISPRS Journal of Photogrammetry and Remote Sensing 230: 918-942. &lt;a href="https://doi.org/10.1016/j.isprsjprs.2025.10.014" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.isprsjprs.2025.10.014&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-ijprs-openfacades/2025-ijprs-openfacades.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/xiucheng-liang/"&gt;Xiucheng Liang&lt;/a&gt;.
Congratulations on the publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;OpenFACADES is an open-source framework designed to enrich building profiles with objective attributes and semantic descriptors by leveraging multimodal crowdsourced data and large vision-language models. It provides tools for integrating diverse datasets, automating building facade detection, and generating detailed annotations at scale.&lt;/p&gt;
&lt;p&gt;The GitHub repository is available &lt;a href="https://github.com/seshing/OpenFACADES" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/1_hu_2c86370938ea313c.webp 400w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/1_hu_99732d791db00a6a.webp 760w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/1_hu_df7f1e347a5c9cde.webp 1200w"
src="https://ual.sg/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/1_hu_2c86370938ea313c.webp"
width="760"
height="560"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/2_hu_c4e71a81246120c.webp 400w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/2_hu_3d2e1d99b13e260d.webp 760w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/2_hu_849854ac3db52536.webp 1200w"
src="https://ual.sg/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/2_hu_c4e71a81246120c.webp"
width="760"
height="388"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Method for acquiring and geolocating holistic building facades.&lt;/li&gt;
&lt;li&gt;Free and open-source pipeline integrating multimodal crowdsourced data.&lt;/li&gt;
&lt;li&gt;Baseline VLMs enable multi-task building facade profiling.&lt;/li&gt;
&lt;li&gt;In-depth discussion of VLMs’ domain-specific robustness and adaptability.&lt;/li&gt;
&lt;li&gt;Half a million buildings from 7 global cities labeled with attributes and captions.&lt;/li&gt;
&lt;/ul&gt;
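The first highlight, selecting images with a suitable vantage point via isovist analysis, boils down to 2D visibility testing. The toy sketch below checks whether a camera has an unobstructed line of sight to a facade midpoint; the function names and the midpoint simplification are illustrative, not the framework's actual geometry pipeline.

```python
# Toy 2D line-of-sight check in the spirit of isovist-based image selection
# (illustrative only; the real pipeline uses Mapillary metadata and OSM geometries).

def _ccw(a, b, c):
    """True if points a, b, c are in counter-clockwise order."""
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, p3, p4):
    """True if segment p1-p2 properly intersects segment p3-p4."""
    return (_ccw(p1, p3, p4) != _ccw(p2, p3, p4)
            and _ccw(p1, p2, p3) != _ccw(p1, p2, p4))

def can_observe(camera, facade, occluders):
    """Camera sees the facade midpoint if no occluder segment blocks the ray."""
    mid = ((facade[0][0] + facade[1][0]) / 2,
           (facade[0][1] + facade[1][1]) / 2)
    return not any(segments_cross(camera, mid, a, b) for a, b in occluders)

facade = ((0.0, 10.0), (10.0, 10.0))
can_observe((5.0, 0.0), facade, [])                           # True
can_observe((5.0, 0.0), facade, [((0.0, 5.0), (10.0, 5.0))])  # False: wall in between
```

A full isovist would sweep rays over all directions; the pairwise crossing test above is its basic building block.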
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/3_hu_ac15b781f45f1a02.webp 400w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/3_hu_4d7178e5bdb13466.webp 760w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/3_hu_d560643ab5835e89.webp 1200w"
src="https://ual.sg/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/3_hu_ac15b781f45f1a02.webp"
width="760"
height="314"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/4_hu_9e3c5f271a232e11.webp 400w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/4_hu_e346c156da2a08bf.webp 760w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/4_hu_c1daf381b02457ac.webp 1200w"
src="https://ual.sg/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/4_hu_9e3c5f271a232e11.webp"
width="760"
height="458"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/5_hu_632e2265eee7ef23.webp 400w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/5_hu_1de9aadb130cf930.webp 760w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/5_hu_dbc9d603bc5b5800.webp 1200w"
src="https://ual.sg/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/5_hu_632e2265eee7ef23.webp"
width="760"
height="460"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/6_hu_b07576b40b32e4d6.webp 400w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/6_hu_a0e6657acd7bf8e2.webp 760w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/6_hu_fb40ae6c83615a97.webp 1200w"
src="https://ual.sg/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/6_hu_b07576b40b32e4d6.webp"
width="662"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;Building properties, such as height, usage, and material, play a crucial role in spatial data infrastructures, supporting various urban applications. Despite their importance, comprehensive building attribute data remain scarce in many urban areas. Recent advances have enabled the extraction of objective building attributes using remote sensing and street-level imagery. However, establishing a pipeline that integrates diverse open datasets, acquires holistic building imagery, and infers comprehensive building attributes at scale remains a significant challenge. Among the first, this study bridges the gaps by introducing OpenFACADES, an open framework that leverages multimodal crowdsourced data to enrich building profiles with both objective attributes and semantic descriptors through multimodal large language models. First, we integrate street-level image metadata from Mapillary with OpenStreetMap geometries via isovist analysis, identifying images that provide suitable vantage points for observing target buildings. Second, we automate the detection of building facades in panoramic imagery and tailor a reprojection approach to convert objects into holistic perspective views that approximate real-world observation. Third, we introduce an innovative approach that harnesses and investigates the capabilities of open-source large vision-language models (VLMs) for multi-attribute prediction and open-vocabulary captioning in building-level analytics, leveraging a globally sourced dataset of 31,180 labeled images from seven cities. Evaluation shows that the fine-tuned VLM excels in multi-attribute inference, outperforming single-attribute computer vision models and zero-shot ChatGPT-4o. Further experiments confirm its superior generalization and robustness across culturally distinct regions and varying image conditions. Finally, the model is applied for large-scale building annotation, generating a dataset of 1.2 million images for half a million buildings.
This open‐source framework enhances the scope, adaptability, and granularity of building‐level assessments, enabling more fine‐grained and interpretable insights into the built environment. Our dataset and code are available openly at: &lt;a href="https://github.com/seshing/OpenFACADES" target="_blank" rel="noopener"&gt;https://github.com/seshing/OpenFACADES&lt;/a&gt;.&lt;/p&gt;
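The reprojection step described in the abstract (turning part of a panorama into a perspective facade view) rests on standard pinhole-to-equirectangular geometry. The following is a textbook-style sketch of that mapping, not the paper's code; the pixel conventions and parameter names are assumptions.

```python
# Map a pixel of a virtual perspective camera to equirectangular panorama
# coordinates (the geometric core of panorama-to-perspective reprojection).
import math

def perspective_to_pano(u, v, width, height, fov_deg, yaw_deg, pano_w, pano_h):
    """Map pixel (u, v) of a virtual pinhole camera (horizontal FOV fov_deg,
    heading yaw_deg) to (x, y) in an equirectangular panorama image."""
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)  # focal length in px
    # Ray in camera space (x right, y down, z forward).
    x, y, z = u - width / 2, v - height / 2, f
    # Rotate about the vertical axis by the camera heading.
    yaw = math.radians(yaw_deg)
    xr = x * math.cos(yaw) + z * math.sin(yaw)
    zr = -x * math.sin(yaw) + z * math.cos(yaw)
    lon = math.atan2(xr, zr)                    # longitude, -pi..pi
    lat = math.atan2(-y, math.hypot(xr, zr))    # latitude, -pi/2..pi/2
    px = (lon / (2 * math.pi) + 0.5) * pano_w
    py = (0.5 - lat / math.pi) * pano_h
    return px, py

# Centre pixel of a forward-facing camera lands at the panorama centre:
perspective_to_pano(320, 240, 640, 480, 90, 0, 4096, 2048)  # (2048.0, 1024.0)
```

Sampling every (u, v) of the target view through this mapping and interpolating the panorama yields the holistic perspective crop.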
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-ijprs-openfacades/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-ijprs-openfacades/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/page-one_hu_9e4a8377d7e35688.webp 400w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/page-one_hu_fa15c8c50663acd9.webp 760w,
/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/page-one_hu_19a88e9ced950fbf.webp 1200w"
src="https://ual.sg/post/2025/11/02/openfacades-an-open-framework-for-architectural-caption-and-attribute-data-enrichment-via-street-view-imagery/page-one_hu_9e4a8377d7e35688.webp"
width="569"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_ijprs_openfacades&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Liang, Xiucheng and Xie, Jinheng and Zhao, Tianhong and Stouffs, Rudi and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.isprsjprs.2025.10.014}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Journal of Photogrammetry and Remote Sensing}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{918--942}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{OpenFACADES: An open framework for architectural caption and attribute data enrichment via street view imagery}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{230}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Introducing SPECS and its paper in Nature Cities</title><link>https://ual.sg/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/</link><pubDate>Fri, 31 Oct 2025 18:32:04 +0800</pubDate><guid>https://ual.sg/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/</guid><description>&lt;p&gt;We are glad to introduce our latest open project: SPECS, which was also published as a paper.
SPECS is a large dataset on visual perception that we collected to understand how demographics and personality shape the perception of streetscapes.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Quintana M, Gu Y, Liang X, Hou Y, Ito K, Zhu Y, Abdelrahman M, Biljecki F (2025): Global urban visual perception varies across demographics and personalities. Nature Cities 2(11): 1092-1106. &lt;a href="https://doi.org/10.1038/s44284-025-00330-x" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1038/s44284-025-00330-x&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-natcities-specs/2025-natcities-specs.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/matias-quintana/"&gt;Matias Quintana&lt;/a&gt;.
Congratulations on the publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The study surveyed 1,000 people from around the world (Chile, the Netherlands, Nigeria, Singapore, and the USA).
We examined how demographics (gender, age, income, education, and race and ethnicity) and, for the first time, personality traits shape perceptions among this demographically balanced pool of participants.
The paper underscores the importance of multi-city and multi-population analysis and is packed with findings, e.g. that safety perceptions in specific locations are explained by gender.
By comparing state-of-the-art perception predictions against our dataset, we also showed the need for locally tuned models rather than one-size-fits-all ones.&lt;/p&gt;
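The comparison of model predictions against human responses can be illustrated with a toy mean-bias computation; the indicator names follow the paper, but the numbers below are made up for illustration, not the paper's data.

```python
# Toy sketch: mean signed error of model scores against human ratings,
# per perception indicator (illustrative numbers only).
def mean_bias(human, model):
    """Positive result: the model over-estimates that indicator on average."""
    return sum(m - h for h, m in zip(human, model)) / len(human)

ratings = {                       # (human, model) scores per indicator
    "safe":       ([6, 7, 5, 6], [8, 8, 7, 8]),   # positive indicator
    "depressing": ([5, 6, 7, 5], [3, 4, 4, 3]),   # negative indicator
}
for name, (h, m) in ratings.items():
    print(name, round(mean_bias(h, m), 2))
# A positive bias on "safe" and a negative bias on "depressing" would match
# the pattern reported in the paper: positive indicators over-estimated,
# negative ones under-estimated.
```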
&lt;p&gt;We are happy to openly release this global, demographically balanced urban visual perception dataset.
The GitHub repository is available &lt;a href="https://github.com/matqr/specs" target="_blank" rel="noopener"&gt;here&lt;/a&gt;, while the dataset can be downloaded from &lt;a href="https://huggingface.co/datasets/matiasqr/specs" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/1_hu_89e2e7642fd37978.webp 400w,
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/1_hu_afce1a106bf4372f.webp 760w,
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/1_hu_cfd5933d6c447198.webp 1200w"
src="https://ual.sg/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/1_hu_89e2e7642fd37978.webp"
width="760"
height="586"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/2_hu_1c7179b376037cef.webp 400w,
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/2_hu_e549598cf15e17aa.webp 760w,
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/2_hu_e581fafbcf188c5c.webp 1200w"
src="https://ual.sg/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/2_hu_1c7179b376037cef.webp"
width="760"
height="630"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/3_hu_9fb0b66b704effb0.webp 400w,
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/3_hu_6dd7d67fb13e7471.webp 760w,
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/3_hu_f77a353ed9e63034.webp 1200w"
src="https://ual.sg/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/3_hu_9fb0b66b704effb0.webp"
width="582"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;Understanding people’s preferences is crucial for urban planning, yet current approaches often combine responses from multi-cultural populations, obscuring demographic differences and risking amplifying biases. We conducted a large-scale urban visual perception survey of streetscapes worldwide using street view imagery, examining how demographics—including gender, age, income, education, race and ethnicity, and personality traits—shape perceptions among 1,000 participants with balanced demographics from five countries and 45 nationalities. This dataset, Street Perception Evaluation Considering Socioeconomics, reveals demographic- and personality-based differences across six traditional indicators—safe, lively, wealthy, beautiful, boring, depressing—and four new ones: live nearby, walk, cycle, green. Location-based sentiments further shape these preferences. Machine-learning models trained on existing global datasets tend to overestimate positive indicators and underestimate negative ones compared to human responses, underscoring the need for local context. Our study aspires to rectify the myopic treatment of street perception, which rarely considers demographics or personality traits.&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-natcities-specs/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-natcities-specs/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/page-one_hu_6ae0fe8ff38f3381.webp 400w,
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/page-one_hu_f03642bb6eaffbfa.webp 760w,
/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/page-one_hu_54c594fb85bd8f5.webp 1200w"
src="https://ual.sg/post/2025/10/31/introducing-specs-and-its-paper-in-nature-cities/page-one_hu_6ae0fe8ff38f3381.webp"
width="563"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_natcities_specs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Quintana, Matias and Gu, Youlong and Liang, Xiucheng and Hou, Yujun and Ito, Koichi and Zhu, Yihan and Abdelrahman, Mahmoud and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1038/s44284-025-00330-x}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Nature Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{11}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1092--1106}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Global urban visual perception varies across demographics and personalities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>We have our first doctor: Binyu Lei!</title><link>https://ual.sg/post/2025/10/18/we-have-our-first-doctor-binyu-lei/</link><pubDate>Sat, 18 Oct 2025 23:19:32 +0800</pubDate><guid>https://ual.sg/post/2025/10/18/we-have-our-first-doctor-binyu-lei/</guid><description>&lt;p&gt;On 14 October 2025, &lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt; defended her PhD thesis &lt;em&gt;Urban Digital Twins: From Conceptualisation To Adoption Through A Human-Centric Lens&lt;/em&gt;.
Congratulations!&lt;/p&gt;
&lt;p&gt;This is not only a great milestone for Binyu but also for our Lab: she is the first PhD graduate from our research group.
Binyu started her doctoral studies in 2021, after graduating with a Master of Urban Planning from the University of Melbourne and gaining industry experience.&lt;/p&gt;
&lt;p&gt;During her PhD, she advanced the role of humans in digital twins through a series of contributions, such as the integration of human-centric information.
Binyu&amp;rsquo;s work was published in leading journals, such as &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt; and &lt;em&gt;Automation in Construction&lt;/em&gt;, and one of the chapters of her thesis received the best paper award at 3D GeoInfo 2023.
She was also invited to give talks at other institutions and contributed to the activities of the Open Geospatial Consortium (OGC) and Climate Change AI (CCAI).&lt;/p&gt;
&lt;p&gt;The committee members were Adrian Chong, Chaewon Ahn, Eddie Lau, and Filip Biljecki (thesis advisor).&lt;/p&gt;
&lt;p&gt;She will continue her career as Assistant Professor in Urban Planning and Data Analytics at the School of Geography, Earth and Environmental Sciences, University of Birmingham Dubai.
We wish her all the best and continued success, and we thank her for the collaboration over the past years.
Binyu has contributed greatly to our research group and has helped shape its research agenda.&lt;/p&gt;
&lt;p&gt;To learn more about Binyu&amp;rsquo;s work, visit &lt;a href="https://binyulei.github.io/" target="_blank" rel="noopener"&gt;her website&lt;/a&gt; and &lt;a href="https://scholar.google.com.sg/citations?hl=en&amp;amp;user=0i0BL_0AAAAJ" target="_blank" rel="noopener"&gt;her Google Scholar profile&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Short abstract of her thesis:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Digital twins, as an innovative technology, have gained popularity in tackling urban issues and supporting decision-making in cities. This innovation advances urban analytics in a variety of domains, such as energy consumption calculation, what-if scenario simulation, or planning regulations evaluation. However, current discourse on urban digital twins often reflects a technology-optimism bias, while overlooking the human and social dimensions that are essential to a healthy, liveable, and sustainable city. This thesis revisits urban digital twins through a human-centric lens, aligning with the socio-technical paradigm shift in the state of the art. It examines the lifecycle of urban digital twins from conceptualisation to socially relevant practices, organised by a research framework incorporating human perspectives. In parallel, new methods and frameworks are developed to integrate human-related information, subjective perception, and interaction into the operation of urban digital twins.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;The thesis contributes to the research landscape in various ways: (1) identifying lifecycle challenges and advancing the socio-technical paradigm of urban digital twins, (2) generating a holistic benchmark of the properties of 3D city models in urban digital twins, (3) enriching semantic and perceptual information in urban digital twins through crowdsourced and human-generated data, (4) leveraging humans as sensors to complete the information loop in urban digital twins, and (5) demonstrating how human-centric urban digital twins can support socially relevant applications, including the dynamic quality of life and outdoor comfort. These insights not only challenge the prevailing narrative of urban digital twins as purely data-centric technology but also pave the way for multidisciplinary practices in the broader discussion, fostering urban sustainability, resilience, and public health and well-being.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/10/18/we-have-our-first-doctor-binyu-lei/2025_Binyu_2_hu_c1331351dcb4cf2e.webp 400w,
/post/2025/10/18/we-have-our-first-doctor-binyu-lei/2025_Binyu_2_hu_44a551871fe1c3b7.webp 760w,
/post/2025/10/18/we-have-our-first-doctor-binyu-lei/2025_Binyu_2_hu_3dd18acff33f13d3.webp 1200w"
src="https://ual.sg/post/2025/10/18/we-have-our-first-doctor-binyu-lei/2025_Binyu_2_hu_c1331351dcb4cf2e.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Want to join us on your PhD journey? Read &lt;a href="https://ual.sg/opportunities/application-guide/"&gt;this guide&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Paper in Nature Sustainability</title><link>https://ual.sg/post/2025/09/10/paper-in-nature-sustainability/</link><pubDate>Wed, 10 Sep 2025 14:32:22 +0800</pubDate><guid>https://ual.sg/post/2025/09/10/paper-in-nature-sustainability/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Yap W, Wu AN, Miller C, Biljecki F (2025): Revealing building operating carbon dynamics for multiple cities. Nature Sustainability 8(10): 1199-1210. &lt;a href="https://doi.org/10.1038/s41893-025-01615-8" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1038/s41893-025-01615-8&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-natsus-revealing/2025-natsus-revealing.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt;.
Congratulations on this important journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper was also &lt;a href="https://cde.nus.edu.sg/news-detail/ai-model-maps-building-emissions-to-support-fairer-climate-policies/" target="_blank" rel="noopener"&gt;featured&lt;/a&gt; by our NUS College of Design and Engineering in &lt;a href="https://www.youtube.com/watch?v=rushehhq_iI" target="_blank" rel="noopener"&gt;a video&lt;/a&gt;:&lt;/p&gt;
&lt;div style="position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;"&gt;
&lt;iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share; fullscreen" loading="eager" referrerpolicy="strict-origin-when-cross-origin" src="https://www.youtube.com/embed/rushehhq_iI?autoplay=0&amp;amp;controls=1&amp;amp;end=0&amp;amp;loop=0&amp;amp;mute=0&amp;amp;start=0" style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;" title="YouTube video"&gt;&lt;/iframe&gt;
&lt;/div&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/09/10/paper-in-nature-sustainability/1_hu_a3a92e8342dc65c5.webp 400w,
/post/2025/09/10/paper-in-nature-sustainability/1_hu_74dfcce7f365c16b.webp 760w,
/post/2025/09/10/paper-in-nature-sustainability/1_hu_16ce9301e1984ea4.webp 1200w"
src="https://ual.sg/post/2025/09/10/paper-in-nature-sustainability/1_hu_a3a92e8342dc65c5.webp"
width="760"
height="595"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/09/10/paper-in-nature-sustainability/2_hu_d7631de71c68b66e.webp 400w,
/post/2025/09/10/paper-in-nature-sustainability/2_hu_f90ba2b601510552.webp 760w,
/post/2025/09/10/paper-in-nature-sustainability/2_hu_c99862a482f7f83c.webp 1200w"
src="https://ual.sg/post/2025/09/10/paper-in-nature-sustainability/2_hu_d7631de71c68b66e.webp"
width="760"
height="659"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/09/10/paper-in-nature-sustainability/3_hu_85fa55d4ad7a4a3c.webp 400w,
/post/2025/09/10/paper-in-nature-sustainability/3_hu_c8b2100faa8eb8b1.webp 760w,
/post/2025/09/10/paper-in-nature-sustainability/3_hu_257d58b78dad95bd.webp 1200w"
src="https://ual.sg/post/2025/09/10/paper-in-nature-sustainability/3_hu_85fa55d4ad7a4a3c.webp"
width="760"
height="575"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;Achieving carbon neutrality is a critical yet elusive goal for many cities, hindered by limited understanding of the relationship between building emissions and their surroundings. To address this challenge, we present a generalizable open science framework that integrates building energy-consumption data, multi-modal geospatial inputs and graph deep learning to quantify building operating emissions and their links to urban form and socio-economic factors. Applying this approach to five cities with diverse climates and planning contexts—Melbourne, New York City (Manhattan), Seattle, Singapore and Washington DC—we demonstrate that our models explain 78.4% of the variation in building operating carbon emissions across cities, achieving state-of-the-art accuracy for urban-scale energy modelling. Our findings reveal strong connections between a city’s planning history and its building carbon profile, alongside stark inequalities where wealthier areas often exhibit the highest per capita emissions. Additionally, the relationship between urban density and building emissions is complex and city specific, with emissions extending beyond dense urban cores into suburban areas. To design effective decarbonization strategies, cities must consider how their planning histories, urban layouts and economic conditions shape current emissions patterns.&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-natsus-revealing/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-natsus-revealing/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/09/10/paper-in-nature-sustainability/page-one_hu_d0b7f34be9101358.webp 400w,
/post/2025/09/10/paper-in-nature-sustainability/page-one_hu_5dc176695415b597.webp 760w,
/post/2025/09/10/paper-in-nature-sustainability/page-one_hu_41c1c636c33f88d4.webp 1200w"
src="https://ual.sg/post/2025/09/10/paper-in-nature-sustainability/page-one_hu_d0b7f34be9101358.webp"
width="569"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_natsus_revealing&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Yap, Winston and Wu, Abraham Noah and Miller, Clayton and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1038/s41893-025-01615-8}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Nature Sustainability}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Revealing building operating carbon dynamics for multiple cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1199-1210}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{8}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Introducing SP-Survey, an open-source platform for streetscape perception research</title><link>https://ual.sg/post/2025/09/09/introducing-sp-survey-an-open-source-platform-for-streetscape-perception-research/</link><pubDate>Tue, 09 Sep 2025 20:19:32 +0800</pubDate><guid>https://ual.sg/post/2025/09/09/introducing-sp-survey-an-open-source-platform-for-streetscape-perception-research/</guid><description>&lt;p&gt;We are glad to announce that our &lt;a href="https://ual.sg/author/sijie-yang/"&gt;Sijie Yang&lt;/a&gt; has developed an open-source platform for streetscape perception research.
SP-Survey is a simple and powerful platform for conducting streetscape perception surveys with image-based questions.
It can be deployed in minutes.
Check it out on &lt;a href="https://github.com/Sijie-Yang/Streetscape-Perception-Survey" target="_blank" rel="noopener"&gt;GitHub&lt;/a&gt;!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/09/09/introducing-sp-survey-an-open-source-platform-for-streetscape-perception-research/1_hu_6e24e0fb081f06f5.webp 400w,
/post/2025/09/09/introducing-sp-survey-an-open-source-platform-for-streetscape-perception-research/1_hu_17e88373b8e75cad.webp 760w,
/post/2025/09/09/introducing-sp-survey-an-open-source-platform-for-streetscape-perception-research/1_hu_79405907fcb401a4.webp 1200w"
src="https://ual.sg/post/2025/09/09/introducing-sp-survey-an-open-source-platform-for-streetscape-perception-research/1_hu_6e24e0fb081f06f5.webp"
width="705"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;If you use this platform, please cite &lt;a href="https://ual.sg/publication/2025-bae-thermal/"&gt;the paper&lt;/a&gt;:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_bae_thermal&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Yang, Sijie and Chong, Adrian and Liu, Pengyuan and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.buildenv.2025.112569}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Building and Environment}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{112569}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Thermal comfort in sight: Thermal affordance and its visual assessment for sustainable streetscape design}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{271}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Appointments in leading journals</title><link>https://ual.sg/post/2025/08/06/appointments-in-leading-journals/</link><pubDate>Wed, 06 Aug 2025 20:19:32 +0800</pubDate><guid>https://ual.sg/post/2025/08/06/appointments-in-leading-journals/</guid><description>&lt;p&gt;The PI of the NUS Urban Analytics Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, was appointed to associate editor roles at two leading journals &amp;ndash; &lt;a href="https://www.sciencedirect.com/journal/computers-environment-and-urban-systems" target="_blank" rel="noopener"&gt;Computers, Environment and Urban Systems (CEUS)&lt;/a&gt; and &lt;a href="https://www.sciencedirect.com/journal/landscape-and-urban-planning" target="_blank" rel="noopener"&gt;Landscape and Urban Planning&lt;/a&gt;, both top 1% journals in their categories.&lt;/p&gt;
&lt;p&gt;This set of appointments is also a recognition of our research group, which we highly appreciate.&lt;/p&gt;
&lt;p&gt;These two journals have been a great inspiration to our Lab thanks to their authors, reviewers, and editors.
Thank you.
This is a great opportunity to pay that forward.&lt;/p&gt;</description></item><item><title>New paper: Bi-directional mapping of morphology metrics and 3D city blocks for enhanced characterisation and generation of urban form</title><link>https://ual.sg/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/</link><pubDate>Sat, 07 Jun 2025 15:37:00 +0800</pubDate><guid>https://ual.sg/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Cai C, Li B, Zhang Q, Wang X, Biljecki F, Herthogs P (2025): Bi-directional mapping of morphology metrics and 3D city blocks for enhanced characterisation and generation of urban form. Sustainable Cities and Society 129: 106441. &lt;a href="https://doi.org/10.1016/j.scs.2025.106441" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.scs.2025.106441&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-scs-bidirectional/2025-scs-bidirectional.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/chenyi-cai/"&gt;Chenyi Cai&lt;/a&gt;.
Congratulations on the publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://doi.org/10.1016/j.scs.2025.106441" target="_blank" rel="noopener"&gt;available open access&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/1_hu_168099c5f58b71de.webp 400w,
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/1_hu_5a84ce65057e1351.webp 760w,
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/1_hu_63489b4dece2721.webp 1200w"
src="https://ual.sg/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/1_hu_168099c5f58b71de.webp"
width="760"
height="422"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Enhanced morphology metrics are proposed to characterise block-scale 3D urban form.&lt;/li&gt;
&lt;li&gt;The introduced metric set integrates UMIs for buildings and performance evaluation.&lt;/li&gt;
&lt;li&gt;Systematic workflows retrieve diverse 3D block models using metrics.&lt;/li&gt;
&lt;li&gt;A novel bi-directional mapping between urban form and morphology metrics is built.&lt;/li&gt;
&lt;li&gt;This study advances performance-driven urban form generation and optimisation in CUD.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/2_hu_b5c5120a39745eb.webp 400w,
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/2_hu_b84e58371cfddb52.webp 760w,
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/2_hu_6d26f263baa74ca1.webp 1200w"
src="https://ual.sg/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/2_hu_b5c5120a39745eb.webp"
width="760"
height="373"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/3_hu_3088c3bbcdb76c47.webp 400w,
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/3_hu_a2c134df45eb1818.webp 760w,
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/3_hu_59c501140be947e3.webp 1200w"
src="https://ual.sg/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/3_hu_3088c3bbcdb76c47.webp"
width="760"
height="374"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/4_hu_24b0d5126460e14a.webp 400w,
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/4_hu_60421158c7201a60.webp 760w,
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/4_hu_ed572402fbccfb04.webp 1200w"
src="https://ual.sg/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/4_hu_24b0d5126460e14a.webp"
width="760"
height="605"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;Urban morphology examines city spatial configurations and plays a pivotal role in urban design and sustainability. Morphology metrics are essential for performance-driven computational urban design (CUD), which integrates the automatic generation of urban form and the evaluation and optimisation of urban performance. Although form generation and performance evaluation both rely on morphology metrics (e.g., floor area ratio), they are rarely unified into one workflow. Typically, form generation methods follow one-directional metric-to-form logic, whereas performance evaluation methods adopt the inverse form-to-metric logic. As a result, morphology metrics are often used in isolation within each method, limiting their applicability across both processes. To address this gap, approaches that can support bi-directional workflows, namely, simultaneous form-to-metric and metric-to-form, have the potential to combine and exchange results from both sides. The methodology introduced in this paper, which we refer to as bi-directional mapping, enables the formulation of sets of morphology metrics derived from form and then enables metric-to-form translation. We present approaches to formulate metric sets composed of indicators related to urban form and performance to characterise complex urban form and support performance evaluation. The metric sets can be derived from different cities, with 3D urban models of New York City as a demonstration in this study. Artificial neural networks are used to cluster 3D models and encode morphology metrics, enabling the generation of diverse urban models through case retrieval. Additionally, the effectiveness of the metrics in representing 3D city blocks is evaluated through comparative analysis. Our methodology identified metric sets that can comprehensively characterise 3D city blocks and enable effective retrieval for generating similar urban models. This improves performance-driven CUD towards sustainable urban design and planning.&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-scs-bidirectional/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-scs-bidirectional/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/page-one_hu_b9d14087162ddfb8.webp 400w,
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/page-one_hu_35b7921aa3eb69ea.webp 760w,
/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/page-one_hu_74c5d6e3ed9de23a.webp 1200w"
src="https://ual.sg/post/2025/06/07/new-paper-bi-directional-mapping-of-morphology-metrics-and-3d-city-blocks-for-enhanced-characterisation-and-generation-of-urban-form/page-one_hu_b9d14087162ddfb8.webp"
width="568"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_scs_bidirectional&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cai, Chenyi and Li, Biao and Zhang, Qiyan and Wang, Xiao and Biljecki, Filip and Herthogs, Pieter}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.scs.2025.106441}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Sustainable Cities and Society}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{106441}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Bi-directional mapping of morphology metrics and 3D city blocks for enhanced characterisation and generation of urban form}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{129}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Call for papers: Open Urban Data Science</title><link>https://ual.sg/post/2025/05/16/call-for-papers-open-urban-data-science/</link><pubDate>Fri, 16 May 2025 08:23:32 +0800</pubDate><guid>https://ual.sg/post/2025/05/16/call-for-papers-open-urban-data-science/</guid><description>&lt;p&gt;&lt;a href="https://www.sciencedirect.com/special-issue/322127/open-urban-data-science" target="_blank" rel="noopener"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/05/16/call-for-papers-open-urban-data-science/1_hu_3a25f721a1b4b0d1.webp 400w,
/post/2025/05/16/call-for-papers-open-urban-data-science/1_hu_4669c39d422b08b1.webp 760w,
/post/2025/05/16/call-for-papers-open-urban-data-science/1_hu_7d6749b721e47ccf.webp 1200w"
src="https://ual.sg/post/2025/05/16/call-for-papers-open-urban-data-science/1_hu_3a25f721a1b4b0d1.webp"
width="760"
height="537"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;We are organising a unique special issue in &lt;a href="https://www.sciencedirect.com/journal/computers-environment-and-urban-systems" target="_blank" rel="noopener"&gt;CEUS&lt;/a&gt;.
The &lt;a href="https://www.sciencedirect.com/special-issue/322127/open-urban-data-science" target="_blank" rel="noopener"&gt;Call for Papers&lt;/a&gt; is copied below.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="open-urban-data-science"&gt;Open Urban Data Science&lt;/h2&gt;
&lt;p&gt;Submission deadline: 31 May 2026&lt;/p&gt;
&lt;h3 id="guest-editors"&gt;Guest editors:&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Filip Biljecki, National University of Singapore, Singapore&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Martin Fleischmann, Charles University, Czechia&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Hugo Ledoux, Delft University of Technology, the Netherlands&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Geoff Boeing, University of Southern California, United States of America&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="special-issue-information"&gt;Special issue information:&lt;/h3&gt;
&lt;p&gt;Computers, Environment and Urban Systems (CEUS) invites submissions for a special issue dedicated to Open Urban Data Science, with a particular focus on open software tools that enable innovative computer-based research on urban systems, systems of cities, and built and natural environments, emphasising the geospatial perspective. This special issue aims to recognise and promote the scholarly contributions represented by research software development at the urban scale.&lt;/p&gt;
&lt;p&gt;We welcome contributions addressing, but not limited to:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Novel software tools for urban data collection, processing, visualisation, or analysis.&lt;/li&gt;
&lt;li&gt;Comprehensive platforms and frameworks that integrate multiple urban data sources.&lt;/li&gt;
&lt;li&gt;Methods for addressing privacy, equity, and ethics in urban data.&lt;/li&gt;
&lt;li&gt;Computational frameworks for urban simulation and modelling.&lt;/li&gt;
&lt;li&gt;Software for analysing urban mobility, morphology, accessibility, and transportation, and other domains relevant to the journal.&lt;/li&gt;
&lt;li&gt;Citizen science platforms for urban data collection and engagement.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Manuscripts should present original software tools, packages, or frameworks developed for urban data science applications. Submissions are expected to be comprehensive, including a literature review, information on software architecture, implementation details, availability, and elaborate examples demonstrating utility for research.&lt;/p&gt;
&lt;p&gt;In addition to software papers, regular article types related to open urban data science are also welcome, provided they demonstrate a strong connection to the theme of the special issue.&lt;/p&gt;
&lt;h3 id="evaluation-criteria"&gt;Evaluation criteria:&lt;/h3&gt;
&lt;p&gt;Submissions will be evaluated based on their relevance and alignment with the journal&amp;rsquo;s scope, quality (e.g. robustness and reliability of the software), usability (e.g. comprehensive documentation, ease of use, and accessibility), significance (e.g. potential impact on research and practice), innovation (e.g. novelty of the software or methodological approach), and sustainability (e.g. maintenance, support, and adoption). The usual expectations from an academic article (e.g. including a state-of-the-art literature review and articulation of scholarly contributions) apply.&lt;/p&gt;
&lt;p&gt;All submissions must adhere to open science principles. The software must be released under an appropriate open-source license, be permanently available, and be free of restrictions on use.&lt;/p&gt;
&lt;p&gt;For questions regarding this special issue, please contact the editors.&lt;/p&gt;
&lt;p&gt;For examples of relevant papers, please see the following past publications in CEUS and related journals.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Boeing, G. (2017). OSMnx: New methods for acquiring, constructing, analyzing, and visualizing complex street networks. Computers, Environment and Urban Systems, 65, 126–139. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2017.05.004" target="_blank" rel="noopener"&gt;https://doi.org/10.1016/j.compenvurbsys.2017.05.004&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Biljecki, F., &amp;amp; Chow, Y. S. (2022). Global Building Morphology Indicators. Computers, Environment and Urban Systems, 95, 101809. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2022.101809" target="_blank" rel="noopener"&gt;https://doi.org/10.1016/j.compenvurbsys.2022.101809&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Danish, M., Labib, S., Ricker, B., &amp;amp; Helbich, M. (2024). A citizen science toolkit to collect human perceptions of urban environments using open street view images. Computers, Environment and Urban Systems, 116, 102207. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2024.102207" target="_blank" rel="noopener"&gt;https://doi.org/10.1016/j.compenvurbsys.2024.102207&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Félix, R., Moura, F., &amp;amp; Lovelace, R. (2024). Reproducible methods for modeling combined public transport and cycling trips and associated benefits: Evidence from the biclaR tool. Computers, Environment and Urban Systems, 117, 102230. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2024.102230" target="_blank" rel="noopener"&gt;https://doi.org/10.1016/j.compenvurbsys.2024.102230&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Ito et al. (2025). ZenSVI: An open-source software for the integrated acquisition, processing and analysis of street view imagery towards scalable urban science. Computers, Environment and Urban Systems, 119: 102283. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2025.102283" target="_blank" rel="noopener"&gt;https://doi.org/10.1016/j.compenvurbsys.2025.102283&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Mahajan, S. (2024). greenR: An open-source framework for quantifying urban greenness. Ecological Indicators, 163, 112108. &lt;a href="https://doi.org/10.1016/j.ecolind.2024.112108" target="_blank" rel="noopener"&gt;https://doi.org/10.1016/j.ecolind.2024.112108&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Sevtsuk, A., &amp;amp; Alhassan, A. (2025). Madina Python package: Scalable urban network analysis for modeling pedestrian and bicycle trips in cities. Journal of Transport Geography, 123, 104130. &lt;a href="https://doi.org/10.1016/j.jtrangeo.2025.104130" target="_blank" rel="noopener"&gt;https://doi.org/10.1016/j.jtrangeo.2025.104130&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="timeline"&gt;Timeline:&lt;/h3&gt;
&lt;p&gt;Submissions are accepted until 31 May 2026.&lt;/p&gt;
&lt;p&gt;Articles will be published on a rolling basis immediately after acceptance and production. The special issue containing an editorial will be released after all submissions are processed, sometime in 2027.&lt;/p&gt;
&lt;p&gt;Software papers do not need to be anonymised and shall contain links to public repositories.&lt;/p&gt;
&lt;p&gt;The journal’s submission platform (&lt;a href="https://www.editorialmanager.com/ceus/default.aspx" target="_blank" rel="noopener"&gt;Editorial Manager&lt;/a&gt;) is now open for submissions to this special issue. Please select the article type “VSI: Open Urban Data Science” when submitting your manuscript online. The submission portal can also be found on the journal homepage &lt;a href="https://www.sciencedirect.com/journal/computers-environment-and-urban-systems" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.elsevier.com/researcher/author/submit-your-paper/special-issues/special-issue-invitation-faqs" target="_blank" rel="noopener"&gt;Check out the FAQs on special issues&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.elsevier.com/authors/submit-your-paper/special-issues" target="_blank" rel="noopener"&gt;Learn more about the benefits of publishing in a special issue&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>New paper: Quantifying seasonal bias in street view imagery for urban form assessment</title><link>https://ual.sg/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/</link><pubDate>Sat, 10 May 2025 13:32:09 +0800</pubDate><guid>https://ual.sg/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Zhao T, Liang X, Biljecki F, Tu W, Cao J, Li X, Yi S (2025): Quantifying seasonal bias in street view imagery for urban form assessment: A global analysis of 40 cities. Computers, Environment and Urban Systems 120: 102302. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2025.102302" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2025.102302&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-ceus-svi-seasonality/2025-ceus-svi-seasonality.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/tianhong-zhao/"&gt;Tianhong Zhao&lt;/a&gt;.
Congratulations on the publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2025.102302" target="_blank" rel="noopener"&gt;available open access&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/1_hu_88f7a1e5a0b6c936.webp 400w,
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/1_hu_20afd8430adb6dfc.webp 760w,
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/1_hu_5cb20ff5b375b67c.webp 1200w"
src="https://ual.sg/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/1_hu_88f7a1e5a0b6c936.webp"
width="760"
height="715"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;First framework for quantifying seasonal bias in street view imagery.&lt;/li&gt;
&lt;li&gt;Evaluates seasonal bias across 40 global cities.&lt;/li&gt;
&lt;li&gt;Identifies significant seasonal bias in high-latitude, low-rainfall cities.&lt;/li&gt;
&lt;li&gt;Examines uncertainty in urban form indices derived from street view data.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/2_hu_b072b450a012d081.webp 400w,
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/2_hu_9791eb4c011f7069.webp 760w,
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/2_hu_baaf84349b60494b.webp 1200w"
src="https://ual.sg/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/2_hu_b072b450a012d081.webp"
width="760"
height="291"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/3_hu_47ca4d1cbdfea568.webp 400w,
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/3_hu_fd9f8e24dd4eb012.webp 760w,
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/3_hu_45ad057f52c97ba7.webp 1200w"
src="https://ual.sg/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/3_hu_47ca4d1cbdfea568.webp"
width="760"
height="377"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/4_hu_523426038995262f.webp 400w,
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/4_hu_fd4205e654a8cae7.webp 760w,
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/4_hu_2c51aaedd29e18ef.webp 1200w"
src="https://ual.sg/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/4_hu_523426038995262f.webp"
width="760"
height="436"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;Street view imagery (SVI), with its rich visual information, is increasingly recognized as a valuable data source for urban research. Particularly, by leveraging computer vision techniques, SVI can be used to calculate various urban form indices (e.g., Green View Index, GVI), providing a new approach for large-scale quantitative assessments of urban environments. However, SVI data collected at the same location in different seasons can yield varying urban form indices due to phenological changes, even when the urban form remains constant. Numerous studies overlook this kind of seasonal bias. To address this gap, we propose a systematic analytical framework for quantifying and evaluating seasonal bias in SVI, drawing on more than 262,000 images from 40 cities worldwide. This framework encompasses three aspects: seasonal bias within urban areas, seasonal bias across cities on a global scale, and the impact of seasonal bias in practical applications. The results reveal that (1) seasonal bias is evident, with an average mean absolute percentage error (MAPE) of 54 % for GVI across all sampled cities, and it is particularly pronounced in areas with significant seasonal bias; (2) seasonal bias is strongly correlated with geographic location, with greater bias observed in cities with lower average rainfall and temperatures; and (3) in practical applications, ignoring seasonal bias may result in analytical errors (e.g., an ARI of 0.35 in clustering). By identifying and quantifying seasonal bias in SVI, this study contributes to improving the accuracy of urban environmental assessments based on street view data and provides new theoretical support for the broader application of such data on a global scale.&lt;/p&gt;
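The seasonal-bias metric at the core of the abstract, the mean absolute percentage error (MAPE) of an urban form index such as the Green View Index (GVI) across seasons, can be sketched as follows. All GVI values below are hypothetical illustrations, not data from the paper:

```python
# Sketch of the seasonal-bias idea from the abstract: the mean absolute
# percentage error (MAPE) of an index measured in different seasons against
# a reference season. All values here are hypothetical illustrations.

def mape(reference: float, observations: list[float]) -> float:
    """MAPE (in percent) of seasonal observations against a reference value."""
    return 100.0 * sum(abs(o - reference) / reference for o in observations) / len(observations)

# Hypothetical Green View Index (GVI) at one sampling point across seasons,
# with summer (fully leafed-out greenery) taken as the reference.
gvi_summer = 0.42
gvi_other_seasons = [0.35, 0.18, 0.28]  # autumn, winter, spring (made up)

print(f"Seasonal MAPE of GVI: {mape(gvi_summer, gvi_other_seasons):.1f}%")
```

A large MAPE at a location whose urban form has not changed signals phenological (seasonal) variation rather than real differences in greenery provision.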
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-ceus-svi-seasonality/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-ceus-svi-seasonality/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/page-one_hu_c936e3b3f3757569.webp 400w,
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/page-one_hu_d5969826250defe3.webp 760w,
/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/page-one_hu_b2f20fb5e664178e.webp 1200w"
src="https://ual.sg/post/2025/05/10/new-paper-quantifying-seasonal-bias-in-street-view-imagery-for-urban-form-assessment/page-one_hu_c936e3b3f3757569.webp"
width="558"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_ceus_svi_seasonality&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Zhao, Tianhong and Liang, Xiucheng and Biljecki, Filip and Tu, Wei and Cao, Jinzhou and Li, Xiaojiang and Yi, Shengao}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2025.102302}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{102302}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Quantifying seasonal bias in street view imagery for urban form assessment: A global analysis of 40 cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{120}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>College Awards and Recognition ceremony</title><link>https://ual.sg/post/2025/04/30/college-awards-and-recognition-ceremony/</link><pubDate>Wed, 30 Apr 2025 14:11:31 +0800</pubDate><guid>https://ual.sg/post/2025/04/30/college-awards-and-recognition-ceremony/</guid><description>&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/30/college-awards-and-recognition-ceremony/featured_hu_e9996ae319c63c4.webp 400w,
/post/2025/04/30/college-awards-and-recognition-ceremony/featured_hu_f5a64068f9367140.webp 760w,
/post/2025/04/30/college-awards-and-recognition-ceremony/featured_hu_65b0275d4f775cd5.webp 1200w"
src="https://ual.sg/post/2025/04/30/college-awards-and-recognition-ceremony/featured_hu_e9996ae319c63c4.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The PI of our research group, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, has been awarded the College Educator Award AY 2023/24 from the NUS College of Design &amp;amp; Engineering.&lt;/p&gt;
&lt;p&gt;This honour recognises faculty teaching excellence and dedication to student learning. A big thank you to our TAs &amp;ndash; &lt;a href="https://ual.sg/author/zicheng-fan/"&gt;Zicheng Fan&lt;/a&gt;, &lt;a href="https://ual.sg/author/yixin-wu/"&gt;Yixin Wu&lt;/a&gt;, &lt;a href="https://ual.sg/author/yihan-zhu/"&gt;Yihan Zhu&lt;/a&gt;, &lt;a href="https://ual.sg/author/xiucheng-liang/"&gt;Xiucheng Liang&lt;/a&gt; and &lt;a href="https://ual.sg/author/koichi-ito/"&gt;Koichi Ito&lt;/a&gt; &amp;ndash; for their invaluable support in our Lab&amp;rsquo;s courses, which are mostly conducted in the Master of Urban Planning programme at our Department of Architecture.&lt;/p&gt;
&lt;p&gt;Congratulations as well to all the other awardees, including colleagues from our sister labs!&lt;/p&gt;
&lt;p&gt;Read more &lt;a href="https://cde.nus.edu.sg/news-detail/faculty-honoured-at-cde-awards-and-recognition-ceremony/" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>New paper: Designing effective image-based surveys for urban visual perception</title><link>https://ual.sg/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/</link><pubDate>Fri, 18 Apr 2025 14:32:22 +0800</pubDate><guid>https://ual.sg/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Gu Y, Quintana M, Liang X, Ito K, Yap W, Biljecki F (2025): Designing effective image-based surveys for urban visual perception. Landscape and Urban Planning, 260: 105368. &lt;a href="https://doi.org/10.1016/j.landurbplan.2025.105368" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.landurbplan.2025.105368&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-land-effective-surveys/2025-land-effective-surveys.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/youlong-gu/"&gt;Youlong Gu&lt;/a&gt;.
Congratulations on his first journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1kxswcUG5aKal" target="_blank" rel="noopener"&gt;freely available&lt;/a&gt; until 2025-06-06.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/1_hu_cda6d4cab61aa197.webp 400w,
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/1_hu_b8cc2440b803de28.webp 760w,
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/1_hu_d929abdabedaa2d9.webp 1200w"
src="https://ual.sg/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/1_hu_cda6d4cab61aa197.webp"
width="760"
height="484"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Perception surveys of urban visual landscapes are crucial in planning and design.&lt;/li&gt;
&lt;li&gt;They vary substantially and there is no standard ensuring validity and robustness.&lt;/li&gt;
&lt;li&gt;We established survey design guidelines supported by statistics and experiments.&lt;/li&gt;
&lt;li&gt;Robust results require &amp;gt;12 Likert ratings and &amp;gt;22 Pairwise Comparisons per image.&lt;/li&gt;
&lt;li&gt;Reporting protocol to communicate effectively the design and parameters of surveys.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/2_hu_bbccd07f1bda4be7.webp 400w,
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/2_hu_4488c06cd532739a.webp 760w,
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/2_hu_355b69ecc95a51f7.webp 1200w"
src="https://ual.sg/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/2_hu_bbccd07f1bda4be7.webp"
width="760"
height="681"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/3_hu_c1f8147919c17ac4.webp 400w,
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/3_hu_e3760dbf83b497e7.webp 760w,
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/3_hu_3f424e1d9c42c2ee.webp 1200w"
src="https://ual.sg/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/3_hu_c1f8147919c17ac4.webp"
width="760"
height="353"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/4_hu_6325cdbbb764f26c.webp 400w,
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/4_hu_22ed0881a962f12.webp 760w,
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/4_hu_6f4f77c7e0895848.webp 1200w"
src="https://ual.sg/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/4_hu_6325cdbbb764f26c.webp"
width="718"
height="414"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;Urban visual perception is important for the human experience in cities, shaped by intertwined characteristics of urban landscapes. By quantifying and explaining these perceptual experiences, researchers can gain insights into human preferences and support decision-making in planning and design. However, past studies have shown inconsistencies in survey design and ambiguities in reporting, leading to concerns about the reliability and reproducibility of results. This study proposes the first comprehensive framework to guide image-based survey design for capturing perceptions of outdoor urban environments across different scenarios, addressing the lack of methodological standardization in current research. We reviewed existing surveys to identify key parameters, conducted comprehensive between-subject and within-subject surveys, and performed statistical analyses to determine best practices for survey design across different contexts. Aiming to set a potential community standard, our study doubles as a blueprint for a reporting protocol for survey designs. Based on the results, we recommend: (1) meeting a minimum of 12 and 22 ratings per image for Likert Scale and Pairwise Comparison studies, respectively, to reach survey reliability, and reporting these alongside other survey design parameters to enhance transparency and reproducibility; (2) when resources allow for larger experiments, adopting a ranking method such as Pairwise Comparison to achieve firmer rating results; and (3) using perspective (non-panoramic) images more frequently, as they exhibit comparable overall scores to panoramic images (R mostly &amp;gt;0.7), while being more widely available via crowdsourced sources, supporting their use in large-scale visual perception research.&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-land-effective-surveys/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-land-effective-surveys/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/page-one_hu_dbf3b4cc4fcd8eb.webp 400w,
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/page-one_hu_68df4fead8c63262.webp 760w,
/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/page-one_hu_fad71f2dd18745a4.webp 1200w"
src="https://ual.sg/post/2025/04/18/new-paper-designing-effective-image-based-surveys-for-urban-visual-perception/page-one_hu_dbf3b4cc4fcd8eb.webp"
width="590"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_land_effective_surveys&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Gu, Youlong and Quintana, Matias and Liang, Xiucheng and Ito, Koichi and Yap, Winston and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.landurbplan.2025.105368}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Landscape and Urban Planning}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{105368}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Designing effective image-based surveys for urban visual perception}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{260}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Urban Aedes aegypti suitability indicators</title><link>https://ual.sg/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/</link><pubDate>Thu, 17 Apr 2025 14:41:09 +0800</pubDate><guid>https://ual.sg/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Knoblauch S, Mukaratirwa RT, Pimenta Paulo FP, de A Rocha A, Su YM, Randhawa S, Lautenbach S, Wilder-Smith A, Rocklöv J, Brady OJ, Biljecki F, Dambach P, Jänisch T, Resch B, Haddawy P, Bärnighausen T, Zipf A (2025): Urban Aedes aegypti suitability indicators: a study in Rio de Janeiro, Brazil. The Lancet Planetary Health 9(4): e264-e273. &lt;a href="https://doi.org/10.1016/S2542-5196%2825%2900049-X" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/S2542-5196(25)00049-X&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-tlph-aedes/2025-tlph-aedes.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://www.geog.uni-heidelberg.de/gis/knoblauch.html" target="_blank" rel="noopener"&gt;Steffen Knoblauch&lt;/a&gt; from the &lt;a href="https://www.geog.uni-heidelberg.de/gis/index_en.html" target="_blank" rel="noopener"&gt;GIScience Research Group&lt;/a&gt; at Heidelberg University in Germany.
Congratulations on the publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://doi.org/10.1016/S2542-5196%2825%2900049-X" target="_blank" rel="noopener"&gt;available open access&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/0_hu_a0fa99985b4f956e.webp 400w,
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/0_hu_49c4ff3b32d371a9.webp 760w,
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/0_hu_25bb6a58df3a64b2.webp 1200w"
src="https://ual.sg/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/0_hu_a0fa99985b4f956e.webp"
width="760"
height="280"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/1_hu_141a0def22c3fc97.webp 400w,
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/1_hu_136b75a1e15c25a1.webp 760w,
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/1_hu_6d1ac5f16216cc53.webp 1200w"
src="https://ual.sg/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/1_hu_141a0def22c3fc97.webp"
width="584"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/2_hu_c8d70e8cfd2e840b.webp 400w,
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/2_hu_3f56f2ad8ac7d8f4.webp 760w,
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/2_hu_5bc48f68a87e4213.webp 1200w"
src="https://ual.sg/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/2_hu_c8d70e8cfd2e840b.webp"
width="760"
height="659"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="summary"&gt;Summary&lt;/h3&gt;
&lt;h4 id="background"&gt;Background&lt;/h4&gt;
&lt;p&gt;Controlling Aedes aegypti stands as the primary strategy in curtailing the global threat of vector-borne viral infections such as dengue fever, which is responsible for around 400 million infections and 40 000 fatalities annually. Effective interventions require a precise understanding of Ae aegypti spatiotemporal distribution and behaviour, particularly in urban settings where most infections occur. However, conventionally applied sample-based entomological surveillance systems often fail to capture the high spatial variability of Ae aegypti that can arise from heterogeneous urban landscapes and restricted Aedes flight range.&lt;/p&gt;
&lt;h4 id="methods"&gt;Methods&lt;/h4&gt;
&lt;p&gt;In this study, we aimed to address the challenge of capturing the spatial variability of Ae aegypti by leveraging emerging geospatial big data, including openly available satellite and street view imagery, to locate common Ae aegypti breeding habitats. These data enabled us to infer the seasonal suitability for Ae aegypti eggs and larvae at a spatial resolution of 200 m within the municipality of Rio de Janeiro, Brazil.&lt;/p&gt;
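&lt;p&gt;As a generic illustration of what inferring suitability at a 200 m resolution involves (a minimal sketch under assumed inputs, not the authors&amp;rsquo; actual pipeline), point observations in projected metre coordinates can be binned into 200 m grid cells and aggregated per cell:&lt;/p&gt;

```python
# Generic sketch: bin point observations (in projected, metre-based
# coordinates) into a 200 m grid and average per cell.
# Illustrative only -- not the authors' actual pipeline.
from collections import defaultdict

CELL = 200.0  # grid resolution in metres

def cell_index(x, y, cell=CELL):
    """Map a projected coordinate (in metres) to its grid-cell index."""
    return (int(x // cell), int(y // cell))

def grid_mean(points, cell=CELL):
    """Average observation value per grid cell.

    points: iterable of (x, y, value) tuples in projected metres.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for x, y, value in points:
        key = cell_index(x, y, cell)
        sums[key] += value
        counts[key] += 1
    return {key: sums[key] / counts[key] for key in sums}

# Three toy observations: the first two fall in the same 200 m cell
obs = [(10.0, 30.0, 4.0), (150.0, 90.0, 6.0), (450.0, 30.0, 7.0)]
print(grid_mean(obs))  # {(0, 0): 5.0, (2, 0): 7.0}
```

&lt;p&gt;The same binning applies to ovitrap counts or habitat indicators alike; only the value being averaged changes.&lt;/p&gt;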
&lt;h4 id="findings"&gt;Findings&lt;/h4&gt;
&lt;p&gt;The proposed microhabitat and macrohabitat indicators for immature Ae aegypti explained the distribution of Ae aegypti ovitrap egg counts by up to 72% (95% CI 70–74) and larval counts by up to 74% (72–76). Spatiotemporal interpolations of ovitrap counts, using suitability indicators, provided high-resolution insights into the spatial variability of urban immature Ae aegypti that could not be captured with sample-based surveillance techniques alone.&lt;/p&gt;
&lt;h4 id="interpretation"&gt;Interpretation&lt;/h4&gt;
&lt;p&gt;The potential of the proposed method lies in synergising entomological field measurements with digital indicators on urban landscape to guide vector control and address the prevailing spread of Ae aegypti-transmitted viruses. Estimating Ae aegypti distributions considering habitat size is particularly important for targeting novel vector control interventions such as Wolbachia.&lt;/p&gt;
&lt;h4 id="funding"&gt;Funding&lt;/h4&gt;
&lt;p&gt;German Research Foundation and Austrian Science Fund.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/3_hu_204e437c0a5d30a4.webp 400w,
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/3_hu_1f8d74260fcbd7a6.webp 760w,
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/3_hu_f057e9a29fd87020.webp 1200w"
src="https://ual.sg/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/3_hu_204e437c0a5d30a4.webp"
width="760"
height="490"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-tlph-aedes/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-tlph-aedes/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/page-one_hu_6d2b95c2d7970bea.webp 400w,
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/page-one_hu_9689a6252acfbb5f.webp 760w,
/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/page-one_hu_dee83511b5f174c2.webp 1200w"
src="https://ual.sg/post/2025/04/17/new-paper-urban-aedes-aegypti-suitability-indicators/page-one_hu_6d2b95c2d7970bea.webp"
width="564"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_tlph_aedes&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Knoblauch, Steffen and Mukaratirwa, Rutendo T and Pimenta, Paulo F P and de A Rocha, Ant{\^o}nio A and Yin, Myat Su and Randhawa, Sukanya and Lautenbach, Sven and Wilder-Smith, Annelies and Rockl{\&amp;#34;o}v, Joacim and Brady, Oliver J and Biljecki, Filip and Dambach, Peter and J{\&amp;#34;a}nisch, Thomas and Resch, Bernd and Haddawy, Peter and B{\&amp;#34;a}rnighausen, Till and Zipf, Alexander}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/s2542-5196(25)00049-x}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{The Lancet Planetary Health}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;month&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nv"&gt;apr&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{4}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{e264--e273}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Urban Aedes aegypti suitability indicators: a study in Rio de Janeiro, Brazil}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{9}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Anniversary of the Lab</title><link>https://ual.sg/post/2025/04/16/anniversary-of-the-lab/</link><pubDate>Wed, 16 Apr 2025 17:31:31 +0800</pubDate><guid>https://ual.sg/post/2025/04/16/anniversary-of-the-lab/</guid><description>&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/16/anniversary-of-the-lab/featured_hu_ccdea34b949ea2ef.webp 400w,
/post/2025/04/16/anniversary-of-the-lab/featured_hu_65298054921f874c.webp 760w,
/post/2025/04/16/anniversary-of-the-lab/featured_hu_bd74296cb7e867dc.webp 1200w"
src="https://ual.sg/post/2025/04/16/anniversary-of-the-lab/featured_hu_ccdea34b949ea2ef.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Time flies when you are surrounded by passionate and bright people.&lt;/p&gt;
&lt;p&gt;This month it has been 6 years since our research group started 🥼🧪🌏.
What began as a small effort has grown into a vibrant &amp;amp; diverse research group through which 80+ researchers from 18 countries, and across several disciplines, have passed, brought together by a shared curiosity for data &amp;amp; cities.&lt;/p&gt;
&lt;p&gt;While we could boast about achievements such as impactful papers and metrics, what we are most grateful for, and proud of, are the real outputs: meaningful collaborations, the career journeys we have helped shape, and the culture we have built.&lt;/p&gt;
&lt;p&gt;A huge thank you to all lab members past and present, our collaborators within and outside NUS &amp;amp; SG, the wider research community for their interest in our work, and everyone who has supported us, in ways big and small, especially our funders and the Department of Architecture, College of Design &amp;amp; Engineering, National University of Singapore where we are proudly based.&lt;/p&gt;
&lt;p&gt;We will keep doing our best to advance urban data science, enhance geospatial &amp;amp; urban informatics capacity building at NUS and beyond, and strengthen Singapore&amp;rsquo;s position in the global academic landscape of our field.&lt;/p&gt;
&lt;p&gt;We have some new exciting research developments to share soon.
Follow our work on our website (&lt;a href="https://ual.sg/post/index.xml"&gt;RSS feed&lt;/a&gt;) or on &lt;a href="https://www.linkedin.com/company/urban-analytics-lab/" target="_blank" rel="noopener"&gt;LinkedIn&lt;/a&gt;.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Liu Y, Wang Z, Ren S, Chen R, Shen Y, Biljecki F (2025): Physical urban change and its socio-environmental impact: Insights from street view imagery. Computers, Environment and Urban Systems 119: 102284. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2025.102284" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2025.102284&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-ceus-urbanchange/2025-ceus-urbanchange.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://www.linkedin.com/in/yingjie-liu-318860211" target="_blank" rel="noopener"&gt;Yingjie Liu&lt;/a&gt; and &lt;a href="https://ual.sg/author/zeyu-wang/"&gt;Zeyu Wang&lt;/a&gt; from the &lt;a href="https://be.uw.edu/" target="_blank" rel="noopener"&gt;College of Built Environments at the University of Washington&lt;/a&gt;.
Congratulations on the publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1kqiOjFQh4Mov" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2025-05-16.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/1_hu_9efbcc76f0dbc4d3.webp 400w,
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/1_hu_b9d403268165c4af.webp 760w,
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/1_hu_a8d6b932d60dbb17.webp 1200w"
src="https://ual.sg/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/1_hu_9efbcc76f0dbc4d3.webp"
width="760"
height="656"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h2 id="highlights"&gt;Highlights&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Investigating formal and informal urban transformation processes and evaluating their impact on public perception.&lt;/li&gt;
&lt;li&gt;Time-series Street View Imagery provides a detailed, human-scale view for detecting urban physical change.&lt;/li&gt;
&lt;li&gt;Urban change detection by combining two machine learning models.&lt;/li&gt;
&lt;li&gt;Examining cities of varying sizes and characteristics to gain a comprehensive understanding of urban transformation.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/2_hu_229da789a7992b03.webp 400w,
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/2_hu_c1f15da365011d7.webp 760w,
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/2_hu_33794096d2f3b124.webp 1200w"
src="https://ual.sg/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/2_hu_229da789a7992b03.webp"
width="760"
height="344"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;Urban transformation not only reshapes physical spaces but also impacts public perception, influencing how people experience their environments. This study utilizes Street View Imagery (SVI) as an emerging, human-level data source to assess urban changes, providing perspective beyond traditional datasets. Existing studies often focus on either urban physical changes or human perception changes, without bridging the two. This research integrates both aspects by combining a change detection model, trained on a self-labeled dataset, and a human perception model based on the crowdsourced Place Pulse 2.0 dataset with input from 81,630 online volunteers, to analyze urban transformation in New York City and Memphis from 2007 to 2023. Our findings reveal differences between the two cities: New York City exhibited small, isolated changes often driven by community needs, while Memphis transitioned from concentrated to more dispersed development patterns. This study provides insights into how physical changes influence public perception within these two cities. It demonstrates how thoughtful, well-planned urban transformation can improve neighborhood&amp;rsquo;s perception such as safety and livability, while also pointing out potential challenges like gentrification or social fragmentation. These findings provide policymakers with valuable insights into human perception, aiding in the creation of more inclusive, vibrant, and resilient urban transformation. This helps ensure that urban transformation efforts are based on community desires and align with long-term sustainability goals.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/3_hu_acfb5ffa8ccd2c9a.webp 400w,
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/3_hu_22f1352496288005.webp 760w,
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/3_hu_4e4797e9f444d2ad.webp 1200w"
src="https://ual.sg/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/3_hu_acfb5ffa8ccd2c9a.webp"
width="760"
height="631"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-ceus-urbanchange/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-ceus-urbanchange/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/page-one_hu_2ef7ced45a26d600.webp 400w,
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/page-one_hu_b4e66f4425512029.webp 760w,
/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/page-one_hu_6767393564b60d92.webp 1200w"
src="https://ual.sg/post/2025/04/01/new-paper-physical-urban-change-and-its-socio-environmental-impact/page-one_hu_2ef7ced45a26d600.webp"
width="567"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_ceus_urbanchange&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Liu, Yingjie and Wang, Zeyu and Ren, Siyi and Chen, Runying and Shen, Yixiang and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2025.102284}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{102284}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Physical urban change and its socio-environmental impact: Insights from street view imagery}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{119}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Introducing ZenSVI</title><link>https://ual.sg/post/2025/03/23/introducing-zensvi/</link><pubDate>Sun, 23 Mar 2025 12:01:37 +0800</pubDate><guid>https://ual.sg/post/2025/03/23/introducing-zensvi/</guid><description>&lt;p&gt;We are excited to announce our project &lt;a href="https://github.com/koito19960406/ZenSVI" target="_blank" rel="noopener"&gt;&lt;em&gt;ZenSVI&lt;/em&gt;&lt;/a&gt;!&lt;/p&gt;
&lt;p&gt;It is an open-source Python package that streamlines projects relying on Street View Imagery (SVI), from download to analysis.
It improves the transparency, reproducibility, and scalability of such research and supports researchers in conducting urban analyses efficiently.
Its modular design facilitates extensions of the package for new use cases.
A comprehensive paper about the project has been published as a namesake &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2025.102283" target="_blank" rel="noopener"&gt;article&lt;/a&gt; in &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt;:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Ito K, Zhu Y, Abdelrahman M, Liang X, Fan Z, Hou Y, Zhao T, Ma R, Fujiwara K, Ouyang J, Quintana M, Biljecki F (2025): ZenSVI: An open-source software for the integrated acquisition, processing and analysis of street view imagery towards scalable urban science. &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt; 119: 102283.
&lt;a href="https://doi.org/10.1016/j.compenvurbsys.2025.102283" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt;10.1016/j.compenvurbsys.2025.102283&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-ceus-zensvi/2025-ceus-zensvi.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/23/introducing-zensvi/1_hu_d325839866c935c6.webp 400w,
/post/2025/03/23/introducing-zensvi/1_hu_54ee2f3d91a9de48.webp 760w,
/post/2025/03/23/introducing-zensvi/1_hu_b78119291c676bbb.webp 1200w"
src="https://ual.sg/post/2025/03/23/introducing-zensvi/1_hu_d325839866c935c6.webp"
width="760"
height="561"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The project was led by &lt;a href="https://ual.sg/author/koichi-ito/"&gt;Koichi Ito&lt;/a&gt;, and it was carried out in a large collaboration within our research group.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/23/introducing-zensvi/2_hu_508aea0f4af32ce3.webp 400w,
/post/2025/03/23/introducing-zensvi/2_hu_2da61deca68b25ec.webp 760w,
/post/2025/03/23/introducing-zensvi/2_hu_255b296c6897ab69.webp 1200w"
src="https://ual.sg/post/2025/03/23/introducing-zensvi/2_hu_508aea0f4af32ce3.webp"
width="760"
height="566"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The package is openly available on &lt;a href="https://github.com/koito19960406/ZenSVI" target="_blank" rel="noopener"&gt;GitHub&lt;/a&gt;, and it is supported by &lt;a href="https://zensvi.readthedocs.io/en/latest/index.html" target="_blank" rel="noopener"&gt;documentation including tutorials&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1kn-6jFQh4Moj" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2025-05-09.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/23/introducing-zensvi/3_hu_8ea5ab9339378f5.webp 400w,
/post/2025/03/23/introducing-zensvi/3_hu_5b3cad234a9ffefb.webp 760w,
/post/2025/03/23/introducing-zensvi/3_hu_b11307eb9f34316b.webp 1200w"
src="https://ual.sg/post/2025/03/23/introducing-zensvi/3_hu_8ea5ab9339378f5.webp"
width="760"
height="439"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/23/introducing-zensvi/4_hu_c7bbc275cd63cefb.webp 400w,
/post/2025/03/23/introducing-zensvi/4_hu_5525d6469e49e0a5.webp 760w,
/post/2025/03/23/introducing-zensvi/4_hu_cb8e849226c86566.webp 1200w"
src="https://ual.sg/post/2025/03/23/introducing-zensvi/4_hu_c7bbc275cd63cefb.webp"
width="760"
height="319"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/23/introducing-zensvi/5_hu_95a3659ea9672b32.webp 400w,
/post/2025/03/23/introducing-zensvi/5_hu_e4d633f38fbe3eeb.webp 760w,
/post/2025/03/23/introducing-zensvi/5_hu_b3c3431c0a5d0f53.webp 1200w"
src="https://ual.sg/post/2025/03/23/introducing-zensvi/5_hu_95a3659ea9672b32.webp"
width="760"
height="686"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/23/introducing-zensvi/6_hu_dcd7c17710c0e54d.webp 400w,
/post/2025/03/23/introducing-zensvi/6_hu_85717a1f5384e002.webp 760w,
/post/2025/03/23/introducing-zensvi/6_hu_21f6a59adc3eb109.webp 1200w"
src="https://ual.sg/post/2025/03/23/introducing-zensvi/6_hu_dcd7c17710c0e54d.webp"
width="760"
height="583"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/23/introducing-zensvi/7_hu_cb13d82b2ba9c13b.webp 400w,
/post/2025/03/23/introducing-zensvi/7_hu_8a257ba11e552339.webp 760w,
/post/2025/03/23/introducing-zensvi/7_hu_803c3fd5b2d69a64.webp 1200w"
src="https://ual.sg/post/2025/03/23/introducing-zensvi/7_hu_cb13d82b2ba9c13b.webp"
width="760"
height="427"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h2 id="highlights"&gt;Highlights&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Streamlined the complete street view imagery analytics process in Python.&lt;/li&gt;
&lt;li&gt;Integration of advanced image processing and analysis techniques.&lt;/li&gt;
&lt;li&gt;Robust transformation and customization tools that can be extended by the open source community.&lt;/li&gt;
&lt;li&gt;Framework to support various SVI analyses such as quality assessment, clustering, and green view index construction.&lt;/li&gt;
&lt;li&gt;Lowering technical barriers and increasing equity of participation in urban studies.&lt;/li&gt;
&lt;/ul&gt;
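&lt;p&gt;As a toy illustration of the green view index mentioned in the highlights (a minimal sketch, not ZenSVI&amp;rsquo;s actual implementation; the vegetation label set here is hypothetical), the index is simply the share of vegetation pixels in a semantically segmented street-level image:&lt;/p&gt;

```python
# Illustrative Green View Index (GVI) computation -- a sketch, not
# ZenSVI's implementation. GVI = share of vegetation pixels in a
# semantically segmented street-level image.

VEGETATION_LABELS = {"tree", "grass", "plant"}  # hypothetical label set

def green_view_index(segmentation, vegetation_labels=VEGETATION_LABELS):
    """Fraction of pixels labelled as vegetation in a 2D label grid."""
    total = 0
    green = 0
    for row in segmentation:
        for label in row:
            total += 1
            if label in vegetation_labels:
                green += 1
    return green / total if total else 0.0

# A 2x4 toy "segmentation" of one image: 3 of 8 pixels are vegetation
mask = [
    ["sky", "tree", "tree", "building"],
    ["road", "grass", "building", "building"],
]
print(green_view_index(mask))  # 0.375
```

&lt;p&gt;In practice the label grid would come from a segmentation model applied to each image, and the per-image indices would then be aggregated spatially.&lt;/p&gt;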
&lt;h2 id="abstract"&gt;Abstract&lt;/h2&gt;
&lt;blockquote&gt;
&lt;p&gt;Street view imagery (SVI) has been instrumental in many studies in the past decade to understand and characterize street features and the built environment. Researchers across a variety of domains, such as transportation, health, architecture, human perception, and infrastructure have employed different methods to analyze SVI. However, these applications and image-processing procedures have not been standardized, and solutions have been implemented in isolation, often making it difficult for others to reproduce existing work and carry out new research. Using SVI for research requires multiple technical steps: accessing APIs for scalable data collection, preprocessing images to standardize formats, implementing computer vision models for feature extraction, and conducting spatial analysis. These technical requirements create barriers for researchers in urban studies, particularly those without extensive programming experience. We developed ZenSVI, a free and open-source Python package that integrates and implements the entire process of SVI analysis, supporting a wide range of use cases. Its end-to-end pipeline includes downloading SVI from multiple platforms (e.g., Mapillary and KartaView) efficiently, analyzing metadata of SVI, applying computer vision models to extract target features, transforming SVI into different projections (e.g., fish-eye and perspective) and different formats (e.g., depth map and point cloud), visualizing analyses with maps and plots, and exporting outputs to other software tools. We demonstrated its use in Singapore through a case study of data quality assessment and clustering analysis in a streamlined manner. Our software improves the transparency, reproducibility, and scalability of research relying on SVI and supports researchers in conducting urban analyses efficiently. Its modular design facilitates extensions of the package for new use cases. 
This package is openly available at &lt;a href="https://github.com/koito19960406/ZenSVI" target="_blank" rel="noopener"&gt;https://github.com/koito19960406/ZenSVI&lt;/a&gt;, and it is supported by documentation including tutorials (&lt;a href="https://zensvi.readthedocs.io/en/latest/examples/index.html" target="_blank" rel="noopener"&gt;https://zensvi.readthedocs.io/en/latest/examples/index.html&lt;/a&gt;).&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-ceus-zensvi/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-ceus-zensvi/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/23/introducing-zensvi/page-one_hu_b04fd9213a7cc33a.webp 400w,
/post/2025/03/23/introducing-zensvi/page-one_hu_6b08868ddb9ceefb.webp 760w,
/post/2025/03/23/introducing-zensvi/page-one_hu_1859a37733ed1134.webp 1200w"
src="https://ual.sg/post/2025/03/23/introducing-zensvi/page-one_hu_b04fd9213a7cc33a.webp"
width="569"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_ceus_zensvi&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Ito, Koichi and Zhu, Yihan and Abdelrahman, Mahmoud and Liang, Xiucheng and Fan, Zicheng and Hou, Yujun and Zhao, Tianhong and Ma, Rui and Fujiwara, Kunihiko and Ouyang, Jiani and Quintana, Matias and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2025.102283}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{102283}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ZenSVI: An open-source software for the integrated acquisition, processing and analysis of street view imagery towards scalable urban science}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{119}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: What is a Digital Twin anyway?</title><link>https://ual.sg/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/</link><pubDate>Wed, 05 Mar 2025 18:30:22 +0800</pubDate><guid>https://ual.sg/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Abdelrahman M, Macatulad E, Lei B, Quintana M, Miller C, Biljecki F (2025): What is a Digital Twin anyway? Deriving the definition for the built environment from over 15,000 scientific publications. Building and Environment, 274: 112748. &lt;a href="https://doi.org/10.1016/j.buildenv.2025.112748" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.buildenv.2025.112748&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-bae-dt-definition/2025-bae-dt-definition.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/mahmoud-abdelrahman/"&gt;Mahmoud Abdelrahman&lt;/a&gt;.
Congratulations to him on his first publication since joining our lab as a postdoctoral research fellow! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The GitHub repository with the open-source code we developed can be found &lt;a href="https://github.com/ualsg/dt-definitions" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/1_hu_908f487e31e2f4ec.webp 400w,
/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/1_hu_13c4a6c2c995879c.webp 760w,
/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/1_hu_fcc5cef5a15cc79f.webp 1200w"
src="https://ual.sg/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/1_hu_908f487e31e2f4ec.webp"
width="760"
height="629"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Digital Twin (DT) is a widely used but still fuzzy term.&lt;/li&gt;
&lt;li&gt;Its existing definitions differ significantly across domains.&lt;/li&gt;
&lt;li&gt;Built Environment DTs rely on data update frequencies ranging from short to long term.&lt;/li&gt;
&lt;li&gt;Building and Urban DTs are in early stages of development.&lt;/li&gt;
&lt;li&gt;AI/ML integration in Urban DTs is emerging but not yet widespread.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/2_hu_f911895d5206394b.webp 400w,
/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/2_hu_cdf211d3b7a2be0f.webp 760w,
/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/2_hu_e32ce709b48a820d.webp 1200w"
src="https://ual.sg/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/2_hu_f911895d5206394b.webp"
width="760"
height="371"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;The concept of Digital Twins (DT) has attracted significant attention across various domains, particularly within the built environment. However, the sheer volume of definitions means that terminological consensus remains out of reach. The lack of a universally accepted definition leads to ambiguities in their conceptualization and implementation, and may cause miscommunication for both researchers and practitioners. We employed Natural Language Processing (NLP) techniques to systematically extract and analyze definitions of DTs from a corpus of more than 15,000 full-text articles spanning diverse disciplines. The study compares these findings with insights from an expert survey that included 52 experts. The study identifies concurrence on the components that comprise a “Digital Twin” from a practical perspective across various domains, contrasting them with those that do not, to identify deviations. We investigate the evolution of digital twin definitions over time and across different scales, including manufacturing, building, and urban/geospatial perspectives. We extracted the main components of Digital Twins using Text Frequency Analysis and N-gram analysis. Subsequently, we identified components that appeared in the literature and conducted a Chi-square test to assess the significance of each component in different domains. Our analysis identified key components of digital twins and revealed significant variations in definitions based on application domains, such as manufacturing, building, and urban contexts. The analysis of DT components reveals two major groups of DT types: High-Performance Real-Time (HPRT) DTs, and Long-Term Decision Support (LTDS) DTs. Contrary to common assumptions, we found that components such as simulation, AI/ML, real-time capabilities, and bi-directional data flow are not yet fully mature in the digital twins of the built environment. We derived two definitions for the Building/Architecture DT and the City/Urban DTs.
Both definitions comprise must-have components (such as spatial and temporal data updates) and good-to-have components (such as prediction, AI, bi-directional data flow, and real-time data exchange). One of the key findings is that the definition of digital twins has not yet reached its equilibrium phase, highlighting the need for ongoing revisions as technologies emerge or existing ones become obsolete. To address this, we introduce a novel, reproducible methodology that enables researchers to refine and adapt the current definitions in response to technological advancements or deprecations.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-bae-dt-definition/"&gt;paper&lt;/a&gt;. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-bae-dt-definition/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/page-one_hu_63462186fcf7c783.webp 400w,
/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/page-one_hu_714eb10673104a3f.webp 760w,
/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/page-one_hu_ec9cd10b655b6b6c.webp 1200w"
src="https://ual.sg/post/2025/03/05/new-paper-what-is-a-digital-twin-anyway/page-one_hu_63462186fcf7c783.webp"
width="562"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_bae_dt_definition&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Abdelrahman, Mahmoud and Macatulad, Edgardo and Lei, Binyu and Quintana, Matias and Miller, Clayton and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.buildenv.2025.112748}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Building and Environment}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{112748}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{274}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{What is a Digital Twin anyway? Deriving the definition for the built environment from over 15,000 scientific publications}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Formalising the urban pattern language</title><link>https://ual.sg/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/</link><pubDate>Wed, 05 Mar 2025 17:30:22 +0800</pubDate><guid>https://ual.sg/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Wu C, Wang J, Wang M, Biljecki F, Kraak MJ (2025): Formalising the urban pattern language: A morphological paradigm towards understanding the multi-scalar spatial structure of cities. Cities, 161: 105854. &lt;a href="https://doi.org/10.1016/j.cities.2025.105854" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.cities.2025.105854&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-cities-pattern-language/2025-cities-pattern-language.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/cai-wu/"&gt;Cai Wu&lt;/a&gt;.
Congratulations to him on this new publication, which is part of his PhD! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/1_hu_bfe35e868d46a298.webp 400w,
/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/1_hu_ee35ac9d18652134.webp 760w,
/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/1_hu_90b3ce6df732e653.webp 1200w"
src="https://ual.sg/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/1_hu_bfe35e868d46a298.webp"
width="760"
height="548"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Develop a quantitative method for multi-scale urban morphology analysis using urban pattern language.&lt;/li&gt;
&lt;li&gt;Quantify selected urban patterns at various scales, demonstrating their structured, non-arbitrary relationships.&lt;/li&gt;
&lt;li&gt;Show how urban pattern language reflects distinct urban contexts and characteristics through case studies.&lt;/li&gt;
&lt;li&gt;Highlight practical applications in planning and design, aiding contextual, sustainable, and informed decision-making.&lt;/li&gt;
&lt;li&gt;Identify future research opportunities by showcasing adaptability to diverse urban contexts and data availability.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/2_hu_d04f306857e083c1.webp 400w,
/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/2_hu_76342373e7d27ca.webp 760w,
/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/2_hu_7fda173a06ff14c2.webp 1200w"
src="https://ual.sg/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/2_hu_d04f306857e083c1.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;The urban form is a foundational element in urban analytics, planning, and design. However, systematic and consistent depiction of urban form is challenging due to the complexity of urban elements and the variety of scales involved. This paper formalizes the concept of ‘urban pattern language’ as a multi-scalar analytical approach to decode such complexity, drawing on Christopher Alexander&amp;rsquo;s idea that offers solutions for recurrent design problems observed in historic and contemporary urban settings. This analytic approach is applied to two case study cities to explore how urban forms can be decoded and communicated across scales and demonstrate how urban morphological elements can be systematically organised into recognisable patterns that simplify analysis and enhance understanding. The findings show that these patterns are not arbitrary but follow structured, rule-based relationships that vary across scales, revealing an underlying order within the urban form. Finally, the study illustrates that these rules are unique to each city, potentially reflecting specific cultural, historical, and spatial contexts. By identifying city-specific, multi-scalar patterns, this approach offers a powerful framework for urban planning and design, allowing practitioners to develop adaptable and context-sensitive strategies.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-cities-pattern-language/"&gt;paper&lt;/a&gt;. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-cities-pattern-language/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/page-one_hu_ad34e4e15fe607ff.webp 400w,
/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/page-one_hu_d869b8dab2ebf4ee.webp 760w,
/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/page-one_hu_e9031bf4de21bbf.webp 1200w"
src="https://ual.sg/post/2025/03/05/new-paper-formalising-the-urban-pattern-language/page-one_hu_ad34e4e15fe607ff.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_cities_pattern_language&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Wu, Cai and Wang, Jiong and Wang, Mingshu and Biljecki, Filip and Kraak, Menno-Jan}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.cities.2025.105854}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{105854}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Formalising the urban pattern language: A morphological paradigm towards understanding the multi-scalar spatial structure of cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{161}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Crime-associated inequality in geographical access to education</title><link>https://ual.sg/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/</link><pubDate>Tue, 04 Mar 2025 16:01:09 +0800</pubDate><guid>https://ual.sg/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Knoblauch S, Muthusamy RK, Moritz M, Kang Y, Li H, Lautenbach S, Pereira RHM, Biljecki F, Gonzalez MC, Barbosa R, Hirata DV, Ludwig C, Adamiak M, de A Rocha AA, Zipf A (2025): Crime-associated inequality in geographical access to education: Insights from the municipality of Rio de Janeiro. Cities, 160: 105818. &lt;a href="https://doi.org/10.1016/j.cities.2025.105818" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.cities.2025.105818&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-cities-crime-education/2025-cities-crime-education.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://www.geog.uni-heidelberg.de/gis/knoblauch.html" target="_blank" rel="noopener"&gt;Steffen Knoblauch&lt;/a&gt; from the &lt;a href="https://www.geog.uni-heidelberg.de/gis/index_en.html" target="_blank" rel="noopener"&gt;GIScience Research Group&lt;/a&gt; at Heidelberg University in Germany.
Congratulations on the publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://doi.org/10.1016/j.cities.2025.105818" target="_blank" rel="noopener"&gt;available open access&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/1_hu_5aeaa0be83301bc6.webp 400w,
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/1_hu_414e1e0307855ee.webp 760w,
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/1_hu_6d2bc1706cbb2055.webp 1200w"
src="https://ual.sg/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/1_hu_5aeaa0be83301bc6.webp"
width="751"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h2 id="highlights"&gt;Highlights&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Crime-conscious routing can reveal inequality in geographic access to education.&lt;/li&gt;
&lt;li&gt;GeoAI can support downscaling of crime records to street level.&lt;/li&gt;
&lt;li&gt;In Rio de Janeiro, dispute areas increase travel time to the nearest school by 48.6 %.&lt;/li&gt;
&lt;li&gt;Findings support targeted interventions to improve school access in high-crime areas.&lt;/li&gt;
&lt;li&gt;Method adaptable to a broad range of urban access studies across diverse cities.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/2_hu_ef853667673fd390.webp 400w,
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/2_hu_64cbbd5989bd9ad6.webp 760w,
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/2_hu_a40719d702cf4ca7.webp 1200w"
src="https://ual.sg/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/2_hu_ef853667673fd390.webp"
width="760"
height="313"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Education is a fundamental right, supported by initiatives like Education for All (EFA) and the Millennium Development Goals (MDGs). Despite progress, full educational access remains challenging, particularly in highly criminal areas. This paper examines the impact of crime on school access in the municipality of Rio de Janeiro. Using ancillary data and geospatial artificial intelligence (GeoAI), we downscaled official police crime records to street level. By considering different levels of crime tolerance in school path choices, we simulated how crime can force students to walk longer distances to avoid violence. Our findings indicate a 48.60 % average increase in travel time to the closest school for students whose shortest routes intersect with high-crime areas. This adjustment reduces mean crime exposure by 44.10 % and maximum exposure by 81.94 %. Both individual crime risk aversion and no-go areas of criminal disputes significantly (p &lt; 0.05) impacted educational access. Estimating street-level crime exposure was challenging due to spatial bias in official and crowdsourced crime reporting. These methods and insights are crucial for improving educational access in high-crime areas, providing a better understanding of barriers to equitable education, and being applicable to other cities and accessibility studies for various societal needs.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/3_hu_f2100270027cfb9e.webp 400w,
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/3_hu_2747b94717e7047c.webp 760w,
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/3_hu_14273fe21cd12708.webp 1200w"
src="https://ual.sg/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/3_hu_f2100270027cfb9e.webp"
width="760"
height="391"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-cities-crime-education/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-cities-crime-education/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/page-one_hu_f153d677524fa883.webp 400w,
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/page-one_hu_37a54b4ec4768886.webp 760w,
/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/page-one_hu_a0ddc44c6cda8173.webp 1200w"
src="https://ual.sg/post/2025/03/04/new-paper-crime-associated-inequality-in-geographical-access-to-education/page-one_hu_f153d677524fa883.webp"
width="591"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_cities_crime_education&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Knoblauch, Steffen and Muthusamy, Ram Kumar and Moritz, Maya and Kang, Yuhao and Li, Hao and Lautenbach, Sven and Pereira, Rafael H.M. and Biljecki, Filip and Gonzalez, Marta C. and Barbosa, Rogerio and Hirata, Daniel Veloso and Ludwig, Christina and Adamiak, Maciej and de A. Rocha, Antônio A. and Zipf, Alexander}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.cities.2025.105818}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{105818}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Crime-associated inequality in geographical access to education: Insights from the municipality of Rio de Janeiro}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{160}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Keynote at the PLATEAU Symposium in Tokyo</title><link>https://ual.sg/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/</link><pubDate>Tue, 18 Feb 2025 17:24:29 +0800</pubDate><guid>https://ual.sg/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/</guid><description>&lt;p&gt;The PI of the Urban Analytics Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, has been invited to be the keynote speaker at the PLATEAU Symposium of Japan&amp;rsquo;s &lt;a href="https://www.mlit.go.jp/en/" target="_blank" rel="noopener"&gt;Ministry of Land, Infrastructure, Transport and Tourism&lt;/a&gt;, presenting to 300 digital twin enthusiasts in the Japanese geospatial community.
This invitation is very much appreciated.
Big thanks also to AIGID (Association for Promotion of Infrastructure Geospatial Information Distribution). 🇯🇵&lt;/p&gt;
&lt;p&gt;The presentation included sharing some views and challenges on urban digital twins against the backdrop of their promise and momentum, and gave a sneak peek into some developments from our &lt;a href="https://ual.sg/"&gt;Urban Analytics Lab&lt;/a&gt; and other groups at the National University of Singapore.
This invitation affirms the relevance of our research and developments in the domain of urban digital twins.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/1_hu_32b0596c71cd56f2.webp 400w,
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/1_hu_be4be37d9a544fc4.webp 760w,
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/1_hu_2fefd822d96c886c.webp 1200w"
src="https://ual.sg/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/1_hu_32b0596c71cd56f2.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Japan&amp;rsquo;s advancements in urban digital twins are truly becoming a global benchmark. 🌏&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.mlit.go.jp/en/toshi/daisei/plateau_en_2.html" target="_blank" rel="noopener"&gt;PLATEAU&lt;/a&gt; is a 3D urban model development, utilisation, and open data project led by MLIT - among the best urban digital twin efforts globally, with quite positive points of nurturing the triangle of government-industry-academia and the impressive involvement of hundreds of cities. The Japanese authorities have recognised the importance of open developments, achieving a unique and comprehensive ecosystem that does not include only data &amp;amp; systems but it also encompasses a vibrant community that comes together through a series of events and initiatives.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/2_hu_aa6e80f8b42516c8.webp 400w,
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/2_hu_c954ff5582d975f0.webp 760w,
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/2_hu_48670557be289ceb.webp 1200w"
src="https://ual.sg/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/2_hu_aa6e80f8b42516c8.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;It was great to exchange ideas with experts from MLIT, including Ms Yuka Sogawa, Dr Sachio Muto and others, as well as panelists Ms Chikako Kurokawa (Asia Air Survey), Mr Michio Oda (Kokusai Kogyo), Mr Kenya Tamura (Eukarya), Mr Yukihiro Nakajima (Asia Air Survey), Mr Shigeru Chiba (NTT InfraNet), and other Japanese digital twin champions.&lt;/p&gt;
&lt;p&gt;Thanks to MLIT, AIGID, and the PLATEAU team (and many others involved, including Ms Yu Nagura and Prof Yoshihide Sekimoto) for the inspiring leadership and activities.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/3_hu_6060fcf8ee91efbd.webp 400w,
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/3_hu_cef742baa4b730a0.webp 760w,
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/3_hu_c1f943bd8b6d9610.webp 1200w"
src="https://ual.sg/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/3_hu_6060fcf8ee91efbd.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/4_hu_fa9330d040c77ef9.webp 400w,
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/4_hu_c0d23d1047c7a054.webp 760w,
/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/4_hu_125db4f98e79cc28.webp 1200w"
src="https://ual.sg/post/2025/02/18/keynote-at-the-plateau-symposium-in-tokyo/4_hu_fa9330d040c77ef9.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper: Thermal comfort in sight</title><link>https://ual.sg/post/2025/01/25/new-paper-thermal-comfort-in-sight/</link><pubDate>Sat, 25 Jan 2025 08:27:22 +0800</pubDate><guid>https://ual.sg/post/2025/01/25/new-paper-thermal-comfort-in-sight/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Yang S, Chong A, Liu P, Biljecki F (2025): Thermal comfort in sight: Thermal affordance and its visual assessment for sustainable streetscape design. Building and Environment, 271: 112569. &lt;a href="https://doi.org/10.1016/j.buildenv.2025.112569" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.buildenv.2025.112569&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-bae-thermal/2025-bae-thermal.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/sijie-yang/"&gt;Sijie Yang&lt;/a&gt;.
Congratulations on his first PhD journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The code and dataset have been released openly on &lt;a href="https://github.com/Sijie-Yang/Thermal-Affordance" target="_blank" rel="noopener"&gt;GitHub&lt;/a&gt;.
The computed VATA data for thermal affordance in Singapore can be accessed &lt;a href="https://thermal-affordance.ual.sg/" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1kUxQ1HudNJSL2" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2025-03-15.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/25/new-paper-thermal-comfort-in-sight/1_hu_2e99a49ab0b8ee15.webp 400w,
/post/2025/01/25/new-paper-thermal-comfort-in-sight/1_hu_aa86394bf12cc047.webp 760w,
/post/2025/01/25/new-paper-thermal-comfort-in-sight/1_hu_5ca83338d74d8734.webp 1200w"
src="https://ual.sg/post/2025/01/25/new-paper-thermal-comfort-in-sight/1_hu_2e99a49ab0b8ee15.webp"
width="529"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/25/new-paper-thermal-comfort-in-sight/2_hu_6c3104b3c6c69acc.webp 400w,
/post/2025/01/25/new-paper-thermal-comfort-in-sight/2_hu_5d45d5206b145311.webp 760w,
/post/2025/01/25/new-paper-thermal-comfort-in-sight/2_hu_3f7c451340bb92a4.webp 1200w"
src="https://ual.sg/post/2025/01/25/new-paper-thermal-comfort-in-sight/2_hu_6c3104b3c6c69acc.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Introducing thermal affordance, linking built environments to thermal comfort.&lt;/li&gt;
&lt;li&gt;Developing a framework (VATA) to assess thermal affordance using street view images.&lt;/li&gt;
&lt;li&gt;Applying SVI and human visual assessment data for two-stage VATA modelling.&lt;/li&gt;
&lt;li&gt;VATA modelling performance is validated by in-field thermal comfort investigation.&lt;/li&gt;
&lt;li&gt;Urban-scale VATA mapping supports sustainable streetscape planning and design.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/25/new-paper-thermal-comfort-in-sight/3_hu_158057dac91f2f34.webp 400w,
/post/2025/01/25/new-paper-thermal-comfort-in-sight/3_hu_67341620ae7d3401.webp 760w,
/post/2025/01/25/new-paper-thermal-comfort-in-sight/3_hu_9fe1d9d6bb3c0c1e.webp 1200w"
src="https://ual.sg/post/2025/01/25/new-paper-thermal-comfort-in-sight/3_hu_158057dac91f2f34.webp"
width="760"
height="462"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;In response to climate change and urban heat island effects, enhancing human thermal comfort in cities is crucial for sustainable urban development. Traditional methods for investigating the urban thermal environment and corresponding human thermal comfort level are often resource intensive, inefficient, and limited in scope. To address these challenges, we (1) introduce a new concept named thermal affordance, which formalises the integrated inherent capacity of a streetscape to influence human thermal comfort based on its visual and physical features; and (2) an efficient method to evaluate it (visual assessment of thermal affordance — VATA), which combines street view imagery (SVI), online and in-field surveys, and statistical learning algorithms. VATA extracts five categories of image features from SVI data and establishes 19 visual-perceptual indicators for streetscape visual assessment. Using a multi-task neural network and elastic net regression, we model their chained relationship to predict and comprehend thermal affordance for Singapore. VATA predictions are validated with field-investigated OTC data, providing a cost-effective, scalable, and transferable method to assess the thermal comfort potential of urban streetscape. Moreover, we demonstrate its utility by generating a geospatially explicit mapping of thermal affordance, outlining a model update workflow for long-term urban-scale analysis, and implementing a two-stage prediction and inference approach (IF-VPI-VATA) to guide future streetscape improvements. This framework can inform streetscape design to support sustainable, liveable, and resilient urban environments.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-bae-thermal/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-bae-thermal/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/25/new-paper-thermal-comfort-in-sight/page-one_hu_15cb11aec93a7b76.webp 400w,
/post/2025/01/25/new-paper-thermal-comfort-in-sight/page-one_hu_4592c14fc3165b0b.webp 760w,
/post/2025/01/25/new-paper-thermal-comfort-in-sight/page-one_hu_c73791b22f965065.webp 1200w"
src="https://ual.sg/post/2025/01/25/new-paper-thermal-comfort-in-sight/page-one_hu_15cb11aec93a7b76.webp"
width="558"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_bae_thermal&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Yang, Sijie and Chong, Adrian and Liu, Pengyuan and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.buildenv.2025.112569}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Building and Environment}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{112569}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Thermal comfort in sight: Thermal affordance and its visual assessment for sustainable streetscape design}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{271}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Coverage and bias of street view imagery in mapping the urban environment</title><link>https://ual.sg/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/</link><pubDate>Fri, 24 Jan 2025 19:43:22 +0800</pubDate><guid>https://ual.sg/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Fan Z, Feng CC, Biljecki F (2025): Coverage and bias of street view imagery in mapping the urban environment. Computers, Environment and Urban Systems, 117: 102253. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2025.102253" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2025.102253&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-ceus-svi-coverage/2025-ceus-svi-coverage.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/zicheng-fan/"&gt;Zicheng Fan&lt;/a&gt;.
Congratulations on this new publication, which is part of his PhD, and on his continued successes! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1kUW1jFQh4Mk3" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2025-03-14.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/1_hu_d5f5d3e97693684f.webp 400w,
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/1_hu_228dddf8321ddf82.webp 760w,
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/1_hu_419cdec2a1d9ba4b.webp 1200w"
src="https://ual.sg/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/1_hu_d5f5d3e97693684f.webp"
width="760"
height="542"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;SVI is used widely but without much understanding of its coverage and bias.&lt;/li&gt;
&lt;li&gt;A computational approach to estimate the coverage of urban environment elements in SVI.&lt;/li&gt;
&lt;li&gt;Introducing frequency and completeness as metrics in measuring coverage.&lt;/li&gt;
&lt;li&gt;SVI covers 62.4 % of buildings in London, with a mean facade completeness of 12.4 %.&lt;/li&gt;
&lt;li&gt;Estimating optimal sampling interval range of 50–60 m for SVI collection.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/2_hu_5c7da875901151b2.webp 400w,
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/2_hu_5060c53de62c5233.webp 760w,
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/2_hu_d8b881c27a14b25.webp 1200w"
src="https://ual.sg/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/2_hu_5c7da875901151b2.webp"
width="760"
height="410"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Street View Imagery (SVI) has emerged as a valuable data form in urban studies, enabling new ways to map and sense urban environments. However, fundamental concerns regarding the representativeness, quality, and reliability of SVI remain underexplored, e.g. to what extent can cities be captured by such data and do data gaps result in bias. This research, positioned at the intersection of spatial data quality and urban analytics, addresses these concerns by proposing a novel and effective method to estimate SVI&amp;rsquo;s element-level coverage in the urban environment. The method integrates the positional relationships between SVI and target elements, as well as the impact of physical obstructions. Expanding the domain of data quality to SVI, we introduce an indicator system that evaluates the extent of coverage, focusing on the completeness and frequency dimensions. Taking London as a case study, three experiments are conducted to identify potential biases in SVI&amp;rsquo;s ability to cover and represent urban environmental elements, using building facades as an example. It is found that despite their high availability along urban road networks, Google Street View covers only 62.4 % of buildings in the case study area. The average facade coverage per building is 12.4 %. SVI tends to over-represent non-residential buildings, thus possibly resulting in biased analyses, and its coverage of environmental elements is position-dependent. The research also highlights the variability of SVI coverage under different data acquisition practices and proposes an optimal sampling interval range of 50–60 m for SVI collection. The findings suggest that while SVI offers valuable insights, it is no panacea – its application in urban research requires careful consideration of data coverage and element-level representativeness to ensure reliable results.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/3_hu_39cc897d8bf940f2.webp 400w,
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/3_hu_889da76948515d31.webp 760w,
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/3_hu_296368d007412820.webp 1200w"
src="https://ual.sg/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/3_hu_39cc897d8bf940f2.webp"
width="760"
height="481"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/4_hu_3f363e705daef37d.webp 400w,
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/4_hu_abe8aa7d08ec6bbd.webp 760w,
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/4_hu_410aa53c2396c056.webp 1200w"
src="https://ual.sg/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/4_hu_3f363e705daef37d.webp"
width="760"
height="731"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-ceus-svi-coverage/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-ceus-svi-coverage/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/page-one_hu_259d9853ae5ed0c0.webp 400w,
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/page-one_hu_2e20cb9c536287c1.webp 760w,
/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/page-one_hu_3e93479bd945e567.webp 1200w"
src="https://ual.sg/post/2025/01/24/new-paper-coverage-and-bias-of-street-view-imagery-in-mapping-the-urban-environment/page-one_hu_259d9853ae5ed0c0.webp"
width="577"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_ceus_svi_coverage&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Fan, Zicheng and Feng, Chen-Chieh and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2025.102253}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{102253}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Coverage and bias of street view imagery in mapping the urban environment}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{117}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Visits to 15 universities in China</title><link>https://ual.sg/post/2025/01/23/visits-to-15-universities-in-china/</link><pubDate>Thu, 23 Jan 2025 15:01:29 +0800</pubDate><guid>https://ual.sg/post/2025/01/23/visits-to-15-universities-in-china/</guid><description>&lt;p&gt;Over the last half year, the PI of the Urban Analytics Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, has been invited to visit 15 universities across China to give guest lectures and presentations, showcase the work of our research group, and establish collaborations.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/23/visits-to-15-universities-in-china/collage-1_hu_3f4f699c647f98ce.webp 400w,
/post/2025/01/23/visits-to-15-universities-in-china/collage-1_hu_60d06b5c122a7b80.webp 760w,
/post/2025/01/23/visits-to-15-universities-in-china/collage-1_hu_c821e9b897029d92.webp 1200w"
src="https://ual.sg/post/2025/01/23/visits-to-15-universities-in-china/collage-1_hu_3f4f699c647f98ce.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;It has been an enriching experience and impressive in many ways &amp;ndash; from the quality of research and the beautiful campuses to the very driven and talented researchers.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/23/visits-to-15-universities-in-china/collage-2_hu_d613a087382046a4.webp 400w,
/post/2025/01/23/visits-to-15-universities-in-china/collage-2_hu_cc7da62f412178e4.webp 760w,
/post/2025/01/23/visits-to-15-universities-in-china/collage-2_hu_b2f9a4612c6ebe0.webp 1200w"
src="https://ual.sg/post/2025/01/23/visits-to-15-universities-in-china/collage-2_hu_d613a087382046a4.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;A big thank you to all the hosts for their wonderful hospitality at (in chronological order): Peking University, Tsinghua University, Hong Kong University of Science and Technology (Guangzhou), South China University of Technology, Nanjing University, Nanjing Normal University, Southeast University, Shenzhen University, Shenzhen Technology University, The Chinese University of Hong Kong, The University of Hong Kong, City University of Hong Kong, The Hong Kong Polytechnic University, Zhejiang A&amp;amp;F University, and Zhejiang University.&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-tsinghua-university-in-beijing"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Tsinghua University in Beijing" srcset="
/post/2025/01/23/visits-to-15-universities-in-china/1_hu_19a8a904f1788deb.webp 400w,
/post/2025/01/23/visits-to-15-universities-in-china/1_hu_ed6695e49ce0a212.webp 760w,
/post/2025/01/23/visits-to-15-universities-in-china/1_hu_3e7e7be3dfc6f1e1.webp 1200w"
src="https://ual.sg/post/2025/01/23/visits-to-15-universities-in-china/1_hu_19a8a904f1788deb.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Tsinghua University in Beijing
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-tsinghua-university-in-beijing"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Tsinghua University in Beijing" srcset="
/post/2025/01/23/visits-to-15-universities-in-china/2_hu_8514cd26b6fd935c.webp 400w,
/post/2025/01/23/visits-to-15-universities-in-china/2_hu_cc6521a1095780b.webp 760w,
/post/2025/01/23/visits-to-15-universities-in-china/2_hu_a1c274fb3364ab19.webp 1200w"
src="https://ual.sg/post/2025/01/23/visits-to-15-universities-in-china/2_hu_8514cd26b6fd935c.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Tsinghua University in Beijing
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-hong-kong-university-of-science-and-technology-guangzhou"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Hong Kong University of Science and Technology (Guangzhou)" srcset="
/post/2025/01/23/visits-to-15-universities-in-china/3_hu_d514e69b2d503dc3.webp 400w,
/post/2025/01/23/visits-to-15-universities-in-china/3_hu_53aada7b299c4cab.webp 760w,
/post/2025/01/23/visits-to-15-universities-in-china/3_hu_5d2d881fc77162f5.webp 1200w"
src="https://ual.sg/post/2025/01/23/visits-to-15-universities-in-china/3_hu_d514e69b2d503dc3.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Hong Kong University of Science and Technology (Guangzhou)
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-hong-kong-university-of-science-and-technology-guangzhou"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Hong Kong University of Science and Technology (Guangzhou)" srcset="
/post/2025/01/23/visits-to-15-universities-in-china/4_hu_6ee6dc2f15a880f2.webp 400w,
/post/2025/01/23/visits-to-15-universities-in-china/4_hu_9ae52f1c9ff1963.webp 760w,
/post/2025/01/23/visits-to-15-universities-in-china/4_hu_106104b414b743a9.webp 1200w"
src="https://ual.sg/post/2025/01/23/visits-to-15-universities-in-china/4_hu_6ee6dc2f15a880f2.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Hong Kong University of Science and Technology (Guangzhou)
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-southeast-university-in-nanjing"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Southeast University in Nanjing" srcset="
/post/2025/01/23/visits-to-15-universities-in-china/5_hu_63e1a832aa6d5772.webp 400w,
/post/2025/01/23/visits-to-15-universities-in-china/5_hu_7247e7beaf4decbb.webp 760w,
/post/2025/01/23/visits-to-15-universities-in-china/5_hu_aa9ca256d5eb8dcd.webp 1200w"
src="https://ual.sg/post/2025/01/23/visits-to-15-universities-in-china/5_hu_63e1a832aa6d5772.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Southeast University in Nanjing
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-peking-university-in-beijing"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Peking University in Beijing" srcset="
/post/2025/01/23/visits-to-15-universities-in-china/6_hu_5586af0840f686a5.webp 400w,
/post/2025/01/23/visits-to-15-universities-in-china/6_hu_ce6cda61ecc599cc.webp 760w,
/post/2025/01/23/visits-to-15-universities-in-china/6_hu_7e32c4347c0e4a05.webp 1200w"
src="https://ual.sg/post/2025/01/23/visits-to-15-universities-in-china/6_hu_5586af0840f686a5.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Peking University in Beijing
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;It was a pleasure to be in the company of exceptional scholars and learn more about their work.
We look forward to continuing to collaborate with these wonderful research groups.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Lei B, Liu P, Liang X, Yan Y, Biljecki F (2025): Developing the urban comfort index: Advancing liveability analytics with a multidimensional approach and explainable artificial intelligence. Sustainable Cities and Society, 120: 106121. &lt;a href="https://doi.org/10.1016/j.scs.2024.106121" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.scs.2024.106121&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-scs-urban-comfort/2025-scs-urban-comfort.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt;.
Congratulations on this new publication, which is part of her PhD, and on her continued successes! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1kRMe7sfVZE4tM" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2025-03-10.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/1_hu_8c0342aea2651afe.webp 400w,
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/1_hu_7d754e1a8f8b82d9.webp 760w,
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/1_hu_e94fb5821a659af5.webp 1200w"
src="https://ual.sg/post/2025/01/15/new-paper-developing-the-urban-comfort-index/1_hu_8c0342aea2651afe.webp"
width="760"
height="381"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;A multidimensional framework for measuring urban comfort.&lt;/li&gt;
&lt;li&gt;An innovative graph structure of neighbourhoods to represent spatial contexts.&lt;/li&gt;
&lt;li&gt;Explainable AI to interpret the non-linear and intricate impacts of urban features.&lt;/li&gt;
&lt;li&gt;Urban comfort demonstrates spatial variation and temporal dynamics.&lt;/li&gt;
&lt;li&gt;A practical use case for simulating scenarios to enhance local comfort.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/2_hu_16d5ec71b4d7f9fc.webp 400w,
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/2_hu_d95c45b15bc76c81.webp 760w,
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/2_hu_6772eed3a944c41b.webp 1200w"
src="https://ual.sg/post/2025/01/15/new-paper-developing-the-urban-comfort-index/2_hu_16d5ec71b4d7f9fc.webp"
width="760"
height="312"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Urban comfort is a means of measuring the dynamic quality of urban life as an outcome of the interaction between humans and urban environments, capturing spatio-temporal phenomena in cities. We design a multidimensional urban comfort framework encompassing 44 features, to comprehensively represent urban living environments, based on 3D urban morphology, socio-economic features, human perception, and environmental factors. We develop a graph-based approach to measure urban comfort through an index and explain its driving forces by exploiting spatial relationships between urban comfort and surrounding features. Explainable artificial intelligence (XAI) is leveraged to interpret feature importance and inherent complexity in urban contexts, advancing conventional methods that are limited to linear relationships. We implement the framework in Amsterdam, generating a city-wide comfort index. Compared to the baseline random forest model, our graph-based approach demonstrates competitive performance in measuring the urban comfort index, achieving an MAE of 1.03, an RMSE of 2.04, and an R-squared value of 93.6%. Meanwhile, we visualise how the urban comfort index changes across quarters, examining the spatio-temporal dynamics at the neighbourhood level. Furthermore, we employ XAI to explain the positive and negative impacts of urban features by categorising neighbourhoods into high and low-comfort groups, indicating the varied contributions of urban features. Exploring the usability of the urban comfort index, we simulate various urban strategies in a neighbourhood of interest benefiting from urban digital twins (e.g. improving air quality to mitigate its negative impact on urban comfort). The urban comfort study demonstrates the potential to address information gaps by incorporating multidimensional features in cities, thereby providing insights into understanding and interpreting local comfort. 
It can further serve as an instrument to inform neighbourhood design, suggest feasible strategies, and indicate far-reaching implications for urban health and wellbeing.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/3_hu_f3848db36af4f5f0.webp 400w,
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/3_hu_19436abdef9dcbb1.webp 760w,
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/3_hu_988e4e33e699b857.webp 1200w"
src="https://ual.sg/post/2025/01/15/new-paper-developing-the-urban-comfort-index/3_hu_f3848db36af4f5f0.webp"
width="760"
height="322"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-scs-urban-comfort/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-scs-urban-comfort/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/page-one_hu_851310e3868ba7d1.webp 400w,
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/page-one_hu_586a5a4eab1e29d5.webp 760w,
/post/2025/01/15/new-paper-developing-the-urban-comfort-index/page-one_hu_63ffc7281229f877.webp 1200w"
src="https://ual.sg/post/2025/01/15/new-paper-developing-the-urban-comfort-index/page-one_hu_851310e3868ba7d1.webp"
width="561"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_scs_urban_comfort&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Lei, Binyu and Liu, Pengyuan and Liang, Xiucheng and Yan, Yingwei and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.scs.2024.106121}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Sustainable Cities and Society}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{106121}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Developing the urban comfort index: Advancing liveability analytics with a multidimensional approach and explainable artificial intelligence}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{120}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Sensing climate justice</title><link>https://ual.sg/post/2024/12/19/new-paper-sensing-climate-justice/</link><pubDate>Thu, 19 Dec 2024 17:31:22 +0800</pubDate><guid>https://ual.sg/post/2024/12/19/new-paper-sensing-climate-justice/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Liu P, Lei B, Huang W, Biljecki F, Wang Y, Li S, Stouffs R (2025): Sensing climate justice: A multi-hyper graph approach for classifying urban heat and flood vulnerability through street view imagery. Sustainable Cities and Society, 118: 106016. &lt;a href="https://doi.org/10.1016/j.scs.2024.106016" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.scs.2024.106016&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-scs-climate/2025-scs-climate.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/pengyuan-liu/"&gt;Pengyuan Liu&lt;/a&gt;.
Congratulations on his continued successes! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/c/1kFpz7sfVZE4dA" target="_blank" rel="noopener"&gt;freely available&lt;/a&gt; until 2025-01-31.&lt;/p&gt;
&lt;p&gt;The code has been released &lt;a href="https://github.com/PengyuanLiu1993/SensingUrbanClimate" target="_blank" rel="noopener"&gt;openly&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/12/19/new-paper-sensing-climate-justice/1_hu_1c278466014aceb1.webp 400w,
/post/2024/12/19/new-paper-sensing-climate-justice/1_hu_d4272ff6f1a20288.webp 760w,
/post/2024/12/19/new-paper-sensing-climate-justice/1_hu_1136b3c3d3b74c1d.webp 1200w"
src="https://ual.sg/post/2024/12/19/new-paper-sensing-climate-justice/1_hu_1c278466014aceb1.webp"
width="760"
height="445"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;A dual-GNN to model multifaceted spatial patterns for studying urban climate justice.&lt;/li&gt;
&lt;li&gt;Spatially-explicit GeoAI incorporates Laws of Geography.&lt;/li&gt;
&lt;li&gt;Nearly 24% performance improvement compared to conventional spatial modelling methods.&lt;/li&gt;
&lt;li&gt;Socio-economic indicators are crucial for understanding urban climate vulnerabilities.&lt;/li&gt;
&lt;li&gt;Spatial structures from multiple levels contribute holistically to the urban climate justice classification.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/12/19/new-paper-sensing-climate-justice/2_hu_80457bffc96cc774.webp 400w,
/post/2024/12/19/new-paper-sensing-climate-justice/2_hu_d5a434ce4535e406.webp 760w,
/post/2024/12/19/new-paper-sensing-climate-justice/2_hu_95c6ba9320f53dd0.webp 1200w"
src="https://ual.sg/post/2024/12/19/new-paper-sensing-climate-justice/2_hu_80457bffc96cc774.webp"
width="760"
height="546"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Recognising the increasing complexities posed by climate challenges to urban environments, it is crucial to develop holistic capabilities for urban areas to effectively respond to climate-related risks, forming the backbone of sustainable urban planning strategies and demanding a comprehensive understanding of urban climate justice. It requires a thorough examination of how climate change exacerbates social, economic, and environmental inequalities within urban settings, which requires a series of sophisticated spatial modellings and relies on data collected periodically. This paper introduces a novel dual-GNN approach, Multi-Hyper Graph Neural Network (MHGNN), with street view imagery as input. The proposed model integrates a multigraph and a hypergraph to model intricate spatial patterns for classifying urban climate justice. The multigraph component of the MHGNN captures spatial proximity and pair-wise connections between urban areas to assess climate impacts. Meanwhile, the hypergraph component addresses higher-order dependencies by incorporating hyperedges that connect multiple geographic areas based on their similarities, thus capturing the multi-faceted relationships among areas with comparable geographic characteristics. By harnessing the strengths of both multigraph and hypergraph structures, the MHGNN provides a comprehensive understanding of the spatial dynamics of urban climate justice. It achieves nearly a 24% performance improvement compared to conventional spatial modelling methods, establishing it as a valuable tool for researchers and policymakers in this domain. Codes available at GitHub.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/12/19/new-paper-sensing-climate-justice/3_hu_e379fd4167ed6d06.webp 400w,
/post/2024/12/19/new-paper-sensing-climate-justice/3_hu_6a263cc27c533d5d.webp 760w,
/post/2024/12/19/new-paper-sensing-climate-justice/3_hu_88b4c768e96ae485.webp 1200w"
src="https://ual.sg/post/2024/12/19/new-paper-sensing-climate-justice/3_hu_e379fd4167ed6d06.webp"
width="760"
height="210"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/12/19/new-paper-sensing-climate-justice/4_hu_c5a55c2304709542.webp 400w,
/post/2024/12/19/new-paper-sensing-climate-justice/4_hu_350df2937777eb01.webp 760w,
/post/2024/12/19/new-paper-sensing-climate-justice/4_hu_5394aaa90c6b88bb.webp 1200w"
src="https://ual.sg/post/2024/12/19/new-paper-sensing-climate-justice/4_hu_c5a55c2304709542.webp"
width="760"
height="313"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-scs-climate/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-scs-climate/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/12/19/new-paper-sensing-climate-justice/page-one_hu_1efc761a9b6c3571.webp 400w,
/post/2024/12/19/new-paper-sensing-climate-justice/page-one_hu_1a066ac0ca57ca36.webp 760w,
/post/2024/12/19/new-paper-sensing-climate-justice/page-one_hu_83337dca38864fb1.webp 1200w"
src="https://ual.sg/post/2024/12/19/new-paper-sensing-climate-justice/page-one_hu_1efc761a9b6c3571.webp"
width="585"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_scs_climate&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Liu, Pengyuan and Lei, Binyu and Huang, Weiming and Biljecki, Filip and Wang, Yuan and Li, Siyu and Stouffs, Rudi}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.scs.2024.106016}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Sustainable Cities and Society}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{106016}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Sensing climate justice: A multi-hyper graph approach for classifying urban heat and flood vulnerability through street view imagery}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{118}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>A video about our research: Designing cities that feel like home</title><link>https://ual.sg/post/2024/12/18/a-video-about-our-research-designing-cities-that-feel-like-home/</link><pubDate>Wed, 18 Dec 2024 16:33:03 +0800</pubDate><guid>https://ual.sg/post/2024/12/18/a-video-about-our-research-designing-cities-that-feel-like-home/</guid><description>&lt;p&gt;The &lt;a href="https://ual.sg/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/"&gt;latest research&lt;/a&gt; of our PhD scholar &lt;a href="https://ual.sg/author/xiucheng-liang/"&gt;Xiucheng Liang&lt;/a&gt; has been featured as a video by NUS!&lt;/p&gt;
&lt;p&gt;Please check it below.&lt;/p&gt;
&lt;div style="position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;"&gt;
&lt;iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share; fullscreen" loading="eager" referrerpolicy="strict-origin-when-cross-origin" src="https://www.youtube.com/embed/HPFjeKFT6ME?autoplay=0&amp;amp;controls=1&amp;amp;end=0&amp;amp;loop=0&amp;amp;mute=0&amp;amp;start=0" style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;" title="YouTube video"&gt;&lt;/iframe&gt;
&lt;/div&gt;
&lt;p&gt;There is also a &lt;a href="https://cde.nus.edu.sg/news-detail/designing-cities-that-feel-like-home/" target="_blank" rel="noopener"&gt;blog post&lt;/a&gt;, which we copy here:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;How do the buildings around us affect our moods and make us feel? From towering glass skyscrapers, to orderly modern residential blocks and buildings with historic facades, the architecture we encounter daily can be a powerful force in shaping our emotions and perceptions.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;“Architecture is about more than just aesthetics - it has a profound psychological impact,” says Xiucheng Liang, a PhD student at the Urban Analytics Lab (UAL) at CDE.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;In this video, Liang introduces a recent study conducted with fellow researchers at the UAL that exposes the relationship between the design of building exteriors and the psychology, emotions, and mental well-being of city residents. Through surveys of nearly 500 participants and cutting-edge AI analysis of over 250,000 building images, the research reveals the surprising ways architecture affects our moods and urban experiences.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;According to Liang, well-designed architecture can inspire emotions like safety, nostalgia, or connection, but it can also lead to feelings of discomfort, anxiety or being overwhelmed. The research provides valuable insights for urban planners and architects, suggesting ways that future cities can be not only functional but also emotionally impactful.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/12/18/a-video-about-our-research-designing-cities-that-feel-like-home/1_hu_2daaa048c832fab.webp 400w,
/post/2024/12/18/a-video-about-our-research-designing-cities-that-feel-like-home/1_hu_913306a2b21cf7aa.webp 760w,
/post/2024/12/18/a-video-about-our-research-designing-cities-that-feel-like-home/1_hu_ab9b53cf7cee79b2.webp 1200w"
src="https://ual.sg/post/2024/12/18/a-video-about-our-research-designing-cities-that-feel-like-home/1_hu_2daaa048c832fab.webp"
width="760"
height="429"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/12/18/a-video-about-our-research-designing-cities-that-feel-like-home/2_hu_b2a8c5e7a8f06c0c.webp 400w,
/post/2024/12/18/a-video-about-our-research-designing-cities-that-feel-like-home/2_hu_421e6cb3b10f0c4b.webp 760w,
/post/2024/12/18/a-video-about-our-research-designing-cities-that-feel-like-home/2_hu_eab838d9982363eb.webp 1200w"
src="https://ual.sg/post/2024/12/18/a-video-about-our-research-designing-cities-that-feel-like-home/2_hu_b2a8c5e7a8f06c0c.webp"
width="760"
height="429"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Many thanks to the Communications Office of our NUS College of Design and Engineering for promoting our work!&lt;/p&gt;
&lt;p&gt;For more information, check out the &lt;a href="https://ual.sg/publication/2024-bae-building/"&gt;publication&lt;/a&gt;:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Liang X, Chang JH, Gao S, Zhao T, Biljecki F (2024): Evaluating human perception of building exteriors using street view imagery. Building and Environment, 263: 111875. &lt;a href="https://doi.org/10.1016/j.buildenv.2024.111875" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.buildenv.2024.111875&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-bae-building/2024-bae-building.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_bae_building&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Liang, Xiucheng and Chang, Jiat Hwee and Gao, Song and Zhao, Tianhong and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.buildenv.2024.111875}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Building and Environment}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{111875}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Evaluating human perception of building exteriors using street view imagery}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{263}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Towards an Integrated Approach for Managing and Streaming 3D Spatial Data at the Component Level in Spatial Data Infrastructures</title><link>https://ual.sg/post/2024/12/17/new-paper-towards-an-integrated-approach-for-managing-and-streaming-3d-spatial-data-at-the-component-level-in-spatial-data-infrastructures/</link><pubDate>Tue, 17 Dec 2024 16:30:22 +0800</pubDate><guid>https://ual.sg/post/2024/12/17/new-paper-towards-an-integrated-approach-for-managing-and-streaming-3d-spatial-data-at-the-component-level-in-spatial-data-infrastructures/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Yu D, Yue P, Wu B, Biljecki F, Chen M, Lu L (2025): Towards an Integrated Approach for Managing and Streaming 3D Spatial Data at the Component Level in Spatial Data Infrastructures. International Journal of Geographical Information Science, 39(4): 847-871. &lt;a href="https://doi.org/10.1080/13658816.2024.2434606" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1080/13658816.2024.2434606&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-ijgis-3-dsdi/2025-ijgis-3-dsdi.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/dayu-yu/"&gt;Dayu Yu&lt;/a&gt;, who was a visiting scholar in our research group while pursuing his PhD at Wuhan University; after graduating, he joined Nanjing Normal University as faculty.
Congratulations on this new publication, which is part of his PhD! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;Here is &lt;a href="https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/"&gt;more information&lt;/a&gt; about his graduation and subsequent job.&lt;/p&gt;
&lt;p&gt;The paper presents design rationales and a unified conceptual model suitable for component-level management of diverse 3D spatial data.
The model is subsequently mapped to a cloud-optimized encoding schema to effectively manage and deliver massive 3D spatial data within SDIs.
This work provides a scientific exploration that integrates management and services to enable direct streaming of managed 3D spatial data without the need for redundant replication and conversion.
The proposed approach is implemented and evaluated across services, accessibility, analysis cases, visualization, and efficiency.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/12/17/new-paper-towards-an-integrated-approach-for-managing-and-streaming-3d-spatial-data-at-the-component-level-in-spatial-data-infrastructures/1_hu_230918941f952d21.webp 400w,
/post/2024/12/17/new-paper-towards-an-integrated-approach-for-managing-and-streaming-3d-spatial-data-at-the-component-level-in-spatial-data-infrastructures/1_hu_68e9a5419f6d55e6.webp 760w,
/post/2024/12/17/new-paper-towards-an-integrated-approach-for-managing-and-streaming-3d-spatial-data-at-the-component-level-in-spatial-data-infrastructures/1_hu_8d9df19606db2d66.webp 1200w"
src="https://ual.sg/post/2024/12/17/new-paper-towards-an-integrated-approach-for-managing-and-streaming-3d-spatial-data-at-the-component-level-in-spatial-data-infrastructures/1_hu_230918941f952d21.webp"
width="760"
height="467"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Transitions of spatial data infrastructures (SDIs) support applications from 2D landscapes to 3D scenes. The existing methods for describing, managing, and providing services for 3D spatial data often lack coordination and efficiency. Moreover, the added complexity of 3D data structures necessitates novel approaches for component-level management and streaming capabilities. In response, we developed a generic conceptual model suitable for component-level management of diverse 3D spatial data in SDIs and discussed the design rationales and key considerations underlying the model. We formalized the flexible data composition and fine-grained lifecycle management in this model and specified this model at the cloud-optimized encoding level to enable efficient CRUD operations and streaming delivery of massive 3D spatial data. Our approach enabled direct streaming of the managed 3D spatial data without the need for redundant replication. We implemented, evaluated, and discussed the proposed approach in terms of service, accessibility, visualization, analysis cases, and efficiency. The results show that the proposed method is efficient in managing 3D spatial data and enables users to conduct 3D geo-analysis on the basis of specific parts of the data as needed. This work provides a scientific exploration that integrates the management and services of 3D spatial data in SDIs.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-ijgis-3-dsdi/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-ijgis-3-dsdi/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/12/17/new-paper-towards-an-integrated-approach-for-managing-and-streaming-3d-spatial-data-at-the-component-level-in-spatial-data-infrastructures/page-one_hu_4e8437c77b57175f.webp 400w,
/post/2024/12/17/new-paper-towards-an-integrated-approach-for-managing-and-streaming-3d-spatial-data-at-the-component-level-in-spatial-data-infrastructures/page-one_hu_663f5e458803cf5c.webp 760w,
/post/2024/12/17/new-paper-towards-an-integrated-approach-for-managing-and-streaming-3d-spatial-data-at-the-component-level-in-spatial-data-infrastructures/page-one_hu_f35f7cc66209b6bb.webp 1200w"
src="https://ual.sg/post/2024/12/17/new-paper-towards-an-integrated-approach-for-managing-and-streaming-3d-spatial-data-at-the-component-level-in-spatial-data-infrastructures/page-one_hu_4e8437c77b57175f.webp"
width="543"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_ijgis_3dsdi&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Yu, Dayu and Yue, Peng and Wu, Binwen and Biljecki, Filip and Chen, Min and Lu, Luancheng}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1080/13658816.2024.2434606}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Geographical Information Science}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{847-871}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{39}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;issue&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{4}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Towards an integrated approach for managing and streaming 3D spatial data at the component level in spatial data infrastructures}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Understanding the user perspective on urban public spaces</title><link>https://ual.sg/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/</link><pubDate>Thu, 07 Nov 2024 13:58:22 +0800</pubDate><guid>https://ual.sg/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Zhu Y, Zhang Y, Biljecki F (2025): Understanding the user perspective on urban public spaces: A systematic review and opportunities for machine learning. Cities, 156: 105535. &lt;a href="https://doi.org/10.1016/j.cities.2024.105535" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.cities.2024.105535&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-cities-userperspective/2025-cities-userperspective.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/yihan-zhu/"&gt;Yihan Zhu&lt;/a&gt;.
Congratulations on his new publication that is part of his PhD! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is available open access.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/1_hu_7828910e1644aca7.webp 400w,
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/1_hu_7d5ca39bcc1fb294.webp 760w,
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/1_hu_2cfbfdf4ad7a912e.webp 1200w"
src="https://ual.sg/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/1_hu_7828910e1644aca7.webp"
width="760"
height="329"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Studies on the user&amp;rsquo;s perspective of urban public spaces are still in their infancy.&lt;/li&gt;
&lt;li&gt;Researchers study the user&amp;rsquo;s perspective on urban public spaces across ten dimensions.&lt;/li&gt;
&lt;li&gt;Perception interpretation, user demographics and data acquisition are key challenges.&lt;/li&gt;
&lt;li&gt;Machine learning offers significant potential for addressing the above challenges.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/2_hu_7d1f5d09d0959ff2.webp 400w,
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/2_hu_1c6a8ec9be0c8952.webp 760w,
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/2_hu_2a9e0b36ceff758c.webp 1200w"
src="https://ual.sg/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/2_hu_7d1f5d09d0959ff2.webp"
width="760"
height="451"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;With people-centered approaches gaining prominence in urban development, studying urban public spaces from the user&amp;rsquo;s perspective has become crucial for effective urban design, planning, and policy-making. The rapid advancement of Machine Learning (ML) techniques has enhanced the ability to analyze and understand user data in urban public spaces, such as usage patterns, activities, and public opinions. However, limited efforts have been made on a structured understanding of urban public spaces from the user&amp;rsquo;s perspective. These knowledge gaps have also hindered the full realization of ML&amp;rsquo;s potential in describing and analyzing urban public spaces. After systematically reviewing 319 relevant papers, this study analyzes ten dimensions of the user&amp;rsquo;s perspective on urban public spaces and identifies three unaddressed issues: (1) interpretation of user&amp;rsquo;s perception, (2) overlooked user demographics, and (3) data acquisition. In addition, this review also examines the applications of ML to these dimensions and their potential to tackle the three issues, and highlights two main opportunities to integrate ML for more rigorous and data-driven public spaces studies: (1) combining Computer Vision and Natural Language Processing in public spaces quality measurement and (2) investing in high-quality user data.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/3_hu_3d388e7234fb6bdb.webp 400w,
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/3_hu_311638088b519123.webp 760w,
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/3_hu_ce7db07d4a0fa3b9.webp 1200w"
src="https://ual.sg/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/3_hu_3d388e7234fb6bdb.webp"
width="760"
height="719"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-cities-userperspective/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-cities-userperspective/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/page-one_hu_6212c485fee0a29a.webp 400w,
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/page-one_hu_2a0f0728acdc326c.webp 760w,
/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/page-one_hu_7bf5c5411fe16211.webp 1200w"
src="https://ual.sg/post/2024/11/07/new-paper-understanding-the-user-perspective-on-urban-public-spaces/page-one_hu_6212c485fee0a29a.webp"
width="581"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_cities_userperspective&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Yihan Zhu and Ye Zhang and Filip Biljecki}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.cities.2024.105535}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{105535}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Understanding the user perspective on urban public spaces: A systematic review and opportunities for machine learning}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{156}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Measuring the value of window views using real estate big data and computer vision</title><link>https://ual.sg/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/</link><pubDate>Sat, 02 Nov 2024 17:30:22 +0800</pubDate><guid>https://ual.sg/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Peng C, Xiang Y, Huang W, Feng Y, Tang Y, Biljecki F, Zhou Z (2025): Measuring the value of window views using real estate big data and computer vision: A case study in Wuhan, China. Cities, 156: 105536. &lt;a href="https://doi.org/10.1016/j.cities.2024.105536" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.cities.2024.105536&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-cities-windowviews/2025-cities-windowviews.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/chucai-peng/"&gt;Chucai Peng&lt;/a&gt;.
Congratulations on his new publication that is part of his PhD! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/c/1k0n9y5jOux9L" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-12-20.&lt;/p&gt;
&lt;p&gt;The dataset has been released &lt;a href="https://github.com/yahaha115/Window-view-dataset" target="_blank" rel="noopener"&gt;openly&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/1_hu_81251af7ccc55e63.webp 400w,
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/1_hu_9ea181826f27c30f.webp 760w,
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/1_hu_40adefad55ba08b8.webp 1200w"
src="https://ual.sg/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/1_hu_81251af7ccc55e63.webp"
width="760"
height="359"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;A pervasive, voluminous, and readily accessible semantic dataset of 3041 window views of high-rise residential buildings.&lt;/li&gt;
&lt;li&gt;Quantifying window view elements from online real estate images with computer vision.&lt;/li&gt;
&lt;li&gt;Spatial hedonic models and interpretable machine learning methods are developed to measure the value of window view elements.&lt;/li&gt;
&lt;li&gt;Water views positively affect house prices in Wuhan, while views of grass and hard ground have a negative impact.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/2_hu_5a84074572187cdd.webp 400w,
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/2_hu_4a6da05f09803c5.webp 760w,
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/2_hu_e826298b36f580e7.webp 1200w"
src="https://ual.sg/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/2_hu_5a84074572187cdd.webp"
width="760"
height="614"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Window views significantly influence residential quality and real estate value, particularly in high-rise residential buildings. Previous studies have predominantly focused on water and green views, resulting in a lack of clarity regarding the influence of other types of views on house prices. In this study, we quantified and analyzed the impacts of 9 window view elements, including sky, high-rise buildings, low-rise buildings, trees, grass, water, hard ground, roads, and barren land, on housing prices using online real estate images and computer vision techniques. Focusing on high-rise buildings constructed in the past five years, our findings, based on spatial hedonic pricing models, reveal that an increased proportion of water views through windows has a significant positive effect on property prices. Conversely, the presence of grass and hard ground is associated with significant negative impacts. This study examines the influence of various window view elements on apartment prices, offering valuable insights for urban planning, architectural design, and property development.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/3_hu_c2225507f5bf4ef4.webp 400w,
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/3_hu_bf240ca3ae631c97.webp 760w,
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/3_hu_6b0286e8d441032.webp 1200w"
src="https://ual.sg/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/3_hu_c2225507f5bf4ef4.webp"
width="760"
height="645"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/4_hu_7758a70c0356d02c.webp 400w,
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/4_hu_468c19d06faa0448.webp 760w,
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/4_hu_940d2a9def48106d.webp 1200w"
src="https://ual.sg/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/4_hu_7758a70c0356d02c.webp"
width="760"
height="424"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-cities-windowviews/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-cities-windowviews/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/page-one_hu_3bcd97be6dff00c6.webp 400w,
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/page-one_hu_45bb5d1fe2c65810.webp 760w,
/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/page-one_hu_85ad4f9fb0f6fa41.webp 1200w"
src="https://ual.sg/post/2024/11/02/new-paper-measuring-the-value-of-window-views-using-real-estate-big-data-and-computer-vision/page-one_hu_3bcd97be6dff00c6.webp"
width="578"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_cities_windowviews&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Peng, Chucai and Xiang, Yang and Huang, Wenjing and Feng, Yale and Tang, Yongqi and Biljecki, Filip and Zhou, Zhixiang}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.cities.2024.105536}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{105536}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Measuring the value of window views using real estate big data and computer vision: A case study in Wuhan, China}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{156}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Examining the causal impacts of the built environment on cycling activities using time-series street view imagery</title><link>https://ual.sg/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/</link><pubDate>Wed, 23 Oct 2024 08:23:22 +0800</pubDate><guid>https://ual.sg/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Ito K, Bansal P, Biljecki F (2024): Examining the causal impacts of the built environment on cycling activities using time-series street view imagery. Transportation Research Part A: Policy and Practice, 190: 104286. &lt;a href="https://doi.org/10.1016/j.tra.2024.104286" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.tra.2024.104286&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-tra-examining/2024-tra-examining.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/koichi-ito/"&gt;Koichi Ito&lt;/a&gt;.
Congratulations on his continued successful publications during his PhD! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1jzWn3Rd3v3eSF" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-12-10.&lt;/p&gt;
&lt;p&gt;The code and dataset have been released &lt;a href="https://github.com/koito19960406/bike_svi" target="_blank" rel="noopener"&gt;openly&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/1_hu_6995da845dde457.webp 400w,
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/1_hu_c7e7da5c90313a6.webp 760w,
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/1_hu_cfb3dc5c526fb6d6.webp 1200w"
src="https://ual.sg/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/1_hu_6995da845dde457.webp"
width="760"
height="425"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Examined causal relationships between street features and cycling mobility.&lt;/li&gt;
&lt;li&gt;Utilized historical SVI data and cyclist count data in London.&lt;/li&gt;
&lt;li&gt;Revealed the effects of urban design on cycling activities.&lt;/li&gt;
&lt;li&gt;Identified heterogeneous treatment effects of urban design interventions.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/2_hu_c7d1d3c611d6809b.webp 400w,
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/2_hu_b499ffcea3bf5b8a.webp 760w,
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/2_hu_734d7b5a0e7a4164.webp 1200w"
src="https://ual.sg/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/2_hu_c7d1d3c611d6809b.webp"
width="760"
height="346"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Cycling is vital for sustainable and healthy cities. To encourage such activities, understanding urban bikeability at both detailed and broad spatial scales is crucial. Street view imagery (SVI) offers in-depth insights into how street features influence micro-mobility patterns, but existing studies are mainly correlational. This research utilized historical time-series SVI and cyclist count data from London to discern the causal effects of specific urban features on cyclist numbers. We used propensity score matching to adjust for potential confounding biases and applied the causal forest to estimate the heterogeneity in causal effects. Key findings include: vegetation significantly boosts cycling, slope negatively impacts cycling, and bike lanes positively influence cycling. Moreover, vegetation’s impact on cycling is greater in less populated areas, while bike lanes have a stronger effect in densely populated regions. These findings help prioritize the areas of intervention. By transcending from mere correlations to identifying heterogeneous causal impacts, this study offers invaluable insights for urban planning, underscoring design strategies to enhance cities’ bikeability and sustainability.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/3_hu_d32548c4c915cf1d.webp 400w,
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/3_hu_c8ff3beff02973b4.webp 760w,
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/3_hu_d44ebe55d72c09b.webp 1200w"
src="https://ual.sg/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/3_hu_d32548c4c915cf1d.webp"
width="760"
height="410"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-tra-examining/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-tra-examining/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/page-one_hu_9f1beee057de973d.webp 400w,
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/page-one_hu_9d0871297f531232.webp 760w,
/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/page-one_hu_b2dd383929594422.webp 1200w"
src="https://ual.sg/post/2024/10/23/new-paper-examining-the-causal-impacts-of-the-built-environment-on-cycling-activities-using-time-series-street-view-imagery/page-one_hu_9f1beee057de973d.webp"
width="559"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_tra_examining&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Ito, Koichi and Bansal, Prateek and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.tra.2024.104286}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Transportation Research Part A: Policy and Practice}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104286}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Examining the causal impacts of the built environment on cycling activities using time-series street view imagery}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{190}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Visits from Finland, Thailand, Indonesia, Italy, and USA</title><link>https://ual.sg/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/</link><pubDate>Tue, 22 Oct 2024 07:11:19 +0800</pubDate><guid>https://ual.sg/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/</guid><description>&lt;p&gt;We continue hosting academic visitors from around the world for exciting lectures and productive exchanges.&lt;/p&gt;
&lt;p&gt;In the past month, we had the pleasure of hosting several overseas visitors for insightful meetings, group discussion sessions, and brief exchanges:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://researchportal.helsinki.fi/en/persons/tuuli-toivonen" target="_blank" rel="noopener"&gt;Tuuli Toivonen, University of Helsinki&lt;/a&gt; 🇫🇮&lt;/li&gt;
&lt;li&gt;A group from the &lt;a href="https://eng.cmu.ac.th/" target="_blank" rel="noopener"&gt;Faculty of Engineering, Chiang Mai University&lt;/a&gt; 🇹🇭&lt;/li&gt;
&lt;li&gt;A group from the &lt;a href="https://geodesi.ugm.ac.id/en/english/" target="_blank" rel="noopener"&gt;Department of Geodetic Engineering, Universitas Gadjah Mada&lt;/a&gt; 🇮🇩&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.polito.it/en/staff?p=041312" target="_blank" rel="noopener"&gt;Elisabetta Colucci&lt;/a&gt; from the &lt;a href="https://www.polito.it/en" target="_blank" rel="noopener"&gt;Politecnico di Torino&lt;/a&gt; 🇮🇹&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.ogc.org/our-team/scott-simmons/" target="_blank" rel="noopener"&gt;Scott Simmons&lt;/a&gt; from &lt;a href="https://www.ogc.org/" target="_blank" rel="noopener"&gt;The Open Geospatial Consortium (OGC)&lt;/a&gt; 🇺🇸&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Starting the recent series, Professor &lt;a href="https://researchportal.helsinki.fi/en/persons/tuuli-toivonen" target="_blank" rel="noopener"&gt;Tuuli Toivonen&lt;/a&gt; (&lt;a href="https://www.helsinki.fi/en" target="_blank" rel="noopener"&gt;University of Helsinki&lt;/a&gt;) gave a compelling lecture on &lt;em&gt;Greener Urban Travel Environments for Everyone: From Measured Wellbeing Impacts to Big Data Analytics&lt;/em&gt;, during which she shared the impressive work of her &lt;a href="https://www.helsinki.fi/en/researchgroups/digital-geography-lab" target="_blank" rel="noopener"&gt;Digital Geography Lab&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/1_hu_d4786c5f57152662.webp 400w,
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/1_hu_6eedf8b91bb542b9.webp 760w,
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/1_hu_4bac9df93e833db3.webp 1200w"
src="https://ual.sg/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/1_hu_d4786c5f57152662.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;In early October, we hosted a productive workshop on urban informatics with a group of faculty from the &lt;a href="https://eng.cmu.ac.th/" target="_blank" rel="noopener"&gt;Faculty of Engineering, Chiang Mai University&lt;/a&gt; in Thailand.
A big thank you to the delegation led by Professor &lt;a href="https://cpemis.eng.cmu.ac.th/~santi/" target="_blank" rel="noopener"&gt;Santi Phithakkitnukoon&lt;/a&gt; (Head of the &lt;a href="http://cpe.eng.cmu.ac.th/" target="_blank" rel="noopener"&gt;Department of Computer Engineering&lt;/a&gt;) for visiting us from Thailand and sharing their work!
We have &lt;a href="https://ual.sg/publication/2024-pt-g-2-viz/"&gt;ongoing collaborations&lt;/a&gt; with Prof Santi and his &lt;a href="https://www.citycontext.info/" target="_blank" rel="noopener"&gt;City Context Lab&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The workshop featured presentations from both sides and a tour of our net-zero energy building.
We are always excited to learn from and collaborate with fellow researchers across Southeast Asia 🌏.
Thanks also to &lt;a href="https://www.bedrockanalytics.com/" target="_blank" rel="noopener"&gt;Bedrock Analytics&lt;/a&gt; for taking part!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/2_hu_1ed1d7df94c259ee.webp 400w,
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/2_hu_19fef961a1e0529e.webp 760w,
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/2_hu_66f31c5c8e2d9003.webp 1200w"
src="https://ual.sg/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/2_hu_1ed1d7df94c259ee.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Next, we are glad to have hosted a large delegation from &lt;a href="https://ugm.ac.id/" target="_blank" rel="noopener"&gt;Universitas Gadjah Mada (UGM)&lt;/a&gt; in Indonesia for a workshop on urban digital twins.
We exchanged experiences in research and teaching with faculty from the &lt;a href="https://geodesi.ugm.ac.id/en/english/" target="_blank" rel="noopener"&gt;Department of Geodetic Engineering&lt;/a&gt;, led by their Head, Professor &lt;a href="https://acadstaff.ugm.ac.id/triasaditya" target="_blank" rel="noopener"&gt;Trias Aditya&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;It was quite insightful to learn about the multitude of activities conducted by one of the most prominent departments in Indonesia and Southeast Asia in our domain.
In turn, they got to know more about our research group, multiple departments at NUS, and our programmes such as the &lt;a href="https://cde.nus.edu.sg/arch/programmes/master-of-urban-planning/" target="_blank" rel="noopener"&gt;Master of Urban Planning&lt;/a&gt; and the &lt;a href="https://fass.nus.edu.sg/geog/msc-in-applied-gis/" target="_blank" rel="noopener"&gt;MSc in Applied GIS&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/3_hu_3ea257335e2cd469.webp 400w,
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/3_hu_9563ec8494ac9ce6.webp 760w,
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/3_hu_8e18ac44a416d944.webp 1200w"
src="https://ual.sg/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/3_hu_3ea257335e2cd469.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;For the first time, we had a visit from Italy &amp;ndash; Dr &lt;a href="https://www.polito.it/en/staff?p=041312" target="_blank" rel="noopener"&gt;Elisabetta Colucci&lt;/a&gt; from the &lt;a href="https://www.dad.polito.it/en/" target="_blank" rel="noopener"&gt;Department of Architecture and Design&lt;/a&gt; at the &lt;a href="https://www.polito.it/en" target="_blank" rel="noopener"&gt;Politecnico di Torino&lt;/a&gt;.
She delivered an insightful lecture, &lt;em&gt;From Built Heritage 3D Spatial Documentation to National Hazards Maps: a multi-scale &amp;amp; multi-techniques research approach&lt;/em&gt;.
The lecture was part of our &lt;a href="https://ual.sg/seminars"&gt;seminar series&lt;/a&gt; and department lectures.&lt;/p&gt;
&lt;p&gt;The programme also included a visit to the &lt;a href="https://www.arclabnus.com/" target="_blank" rel="noopener"&gt;Architectural Conservation Laboratory (ArClab)&lt;/a&gt;, which is part of our department.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/4_hu_8f3182f73e9c4b19.webp 400w,
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/4_hu_e5d32ca435af15f4.webp 760w,
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/4_hu_f8dfdf8512f6f1b2.webp 1200w"
src="https://ual.sg/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/4_hu_8f3182f73e9c4b19.webp"
width="760"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Finally, it was a pleasure to have &lt;a href="https://www.ogc.org/our-team/scott-simmons/" target="_blank" rel="noopener"&gt;Scott Simmons&lt;/a&gt;, Chief Standards Officer of &lt;a href="https://www.ogc.org/" target="_blank" rel="noopener"&gt;The Open Geospatial Consortium (OGC)&lt;/a&gt;, visit our research group.
We have been &lt;a href="https://ual.sg/post/2020/09/23/the-pi-of-the-lab-elected-co-chair-of-the-open-geospatial-consortium-3dim-dwg/"&gt;an active member&lt;/a&gt; of the OGC for a long time, and we are glad to continue our involvement and collaboration. 🌐&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/5_hu_a0f8df8c42330dd1.webp 400w,
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/5_hu_5a8289bffaa457aa.webp 760w,
/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/5_hu_39e8a022770d9d35.webp 1200w"
src="https://ual.sg/post/2024/10/22/visits-from-finland-thailand-indonesia-italy-and-usa/5_hu_a0f8df8c42330dd1.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Thanks, everyone!&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Luo J, Liu P, Xu W, Zhao T, Biljecki F (2025): A perception-powered urban digital twin to support human-centered urban planning and sustainable city development. Cities, 156: 105473. &lt;a href="https://doi.org/10.1016/j.cities.2024.105473" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.cities.2024.105473&lt;/a&gt; &lt;a href="https://ual.sg/publication/2025-cities-perception-dt/2025-cities-perception-dt.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/junjie-luo/"&gt;Junjie Luo&lt;/a&gt; from Zhejiang A&amp;amp;F University, who was previously a researcher in our Lab.
Congratulations and thank you for the continued productive collaboration! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1jzIFy5jOuw-R" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-12-10.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/1_hu_199b56cb56b31988.webp 400w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/1_hu_1800fca3af1f8197.webp 760w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/1_hu_430ae4bba6cd32d5.webp 1200w"
src="https://ual.sg/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/1_hu_199b56cb56b31988.webp"
width="760"
height="468"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/2_hu_48e30818f35c7b00.webp 400w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/2_hu_5266fa34af9a8533.webp 760w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/2_hu_14a3b47857fac278.webp 1200w"
src="https://ual.sg/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/2_hu_48e30818f35c7b00.webp"
width="760"
height="371"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Integrating human visual perceptions into urban digital twins (UDTs).&lt;/li&gt;
&lt;li&gt;Developing public-centered UDTs for automatic perception simulations.&lt;/li&gt;
&lt;li&gt;Incorporating human-in-the-loop subjective perceptions via immersive virtual reality.&lt;/li&gt;
&lt;li&gt;Automating perceptions through photo-realistic scenario simulations.&lt;/li&gt;
&lt;li&gt;Implementing UDTs in a greenway setting to enhance citizen participation and collaboration.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/3_hu_e318227054fdc2a2.webp 400w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/3_hu_29fbf69be8b73e8b.webp 760w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/3_hu_a666fc6184845736.webp 1200w"
src="https://ual.sg/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/3_hu_e318227054fdc2a2.webp"
width="760"
height="235"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/4_hu_a5729af3ed2e22f5.webp 400w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/4_hu_f1256d3fb9154173.webp 760w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/4_hu_56075157174d4517.webp 1200w"
src="https://ual.sg/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/4_hu_a5729af3ed2e22f5.webp"
width="760"
height="453"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Urban Digital Twins (UDTs) offer a promising avenue for advancing sustainable urban development by mirroring physical environments and complex urban dynamics. Such technology enables urban planners to predict and analyze the impacts of various urban scenarios, addressing a global priority for sustainable urban environments. However, their potential in public engagement for environmental perception remains unfulfilled, with existing research lacking the capability to analyze urbanscapes&amp;rsquo; visual features and predict public perceptions based on photo-realistic renderings. To fill this gap, our study developed and implemented a UDT platform designed for the dual purposes of objective feature evaluation and subjective visual perception, alongside the prediction of perceptions in simulated scenarios. We incorporated DeepLabV3, a deep learning model for imagery semantic segmentation, to quantify a series of visual features within the built environment, such as the proportion of vegetation and architectural elements. Subjective visual perceptions (e.g., safety and liveliness) are captured using immersive virtual reality to gather public perceptions of different scenarios and learn patterns. Further, utilizing a photo-realistic rendering engine, high-quality renderings of textures and materials for UDT were achieved, and we proved their veracity based on a perception experiment. Afterwards, we employ the random forest algorithm for automated perception predictions of rendering scenarios. The implementation was demonstrated with a case study on an urban greenway in the central area of Singapore.
We compared both the objective evaluation and subjective perception results, followed by a demonstration of automated visual perception prediction through photo-realistic scenario simulations, such as modifying vegetation density or introducing new architectural elements to the skyline, to predict the perception of scenarios before they are built, leading to more efficient and automated urban planning.&lt;/p&gt;
&lt;/blockquote&gt;
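&lt;p&gt;For readers curious about the objective evaluation step, the idea of quantifying visual features from a segmentation output can be sketched as follows. This is an illustrative snippet only (not the paper&amp;rsquo;s code), and the class IDs are hypothetical placeholders rather than DeepLabV3&amp;rsquo;s actual label map:&lt;/p&gt;

```python
import numpy as np

# Hypothetical class IDs for this illustration; a real DeepLabV3 label map
# defines its own integer label per semantic class.
VEGETATION, BUILDING, SKY = 1, 2, 3

def feature_proportions(mask: np.ndarray) -> dict:
    """Share of image pixels covered by each visual feature,
    given a per-pixel semantic segmentation mask."""
    return {
        "green_view": float(np.mean(mask == VEGETATION)),
        "building_view": float(np.mean(mask == BUILDING)),
        "sky_view": float(np.mean(mask == SKY)),
    }

# Toy 2x4 "mask" standing in for a segmented street-level image
mask = np.array([[1, 1, 3, 3],
                 [2, 2, 2, 1]])
props = feature_proportions(mask)  # green_view = 0.375, sky_view = 0.25
```

&lt;p&gt;Per-image proportions of this kind are the sort of objective indicators that can then feed a perception-prediction model such as a random forest, as described in the abstract.&lt;/p&gt;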
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/5_hu_6bde048acd94c734.webp 400w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/5_hu_80d41f6aa4edebd0.webp 760w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/5_hu_45c8c76575de480a.webp 1200w"
src="https://ual.sg/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/5_hu_6bde048acd94c734.webp"
width="760"
height="685"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2025-cities-perception-dt/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2025-cities-perception-dt/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/page-one_hu_e81683d98e78208b.webp 400w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/page-one_hu_4449b4da08b9e2eb.webp 760w,
/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/page-one_hu_215717c28e765ed2.webp 1200w"
src="https://ual.sg/post/2024/10/21/new-paper-a-perception-powered-urban-digital-twin-to-support-human-centered-urban-planning-and-sustainable-city-development/page-one_hu_e81683d98e78208b.webp"
width="586"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2025_cities_perception_dt&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Luo, Junjie and Liu, Pengyuan and Xu, Wenhui and Zhao, Tianhong and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.cities.2024.105473}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{105473}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{A perception-powered urban digital twin to support human-centered urban planning and sustainable city development}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{156}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2025}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Nighttime Street View Imagery</title><link>https://ual.sg/post/2024/10/19/new-paper-nighttime-street-view-imagery/</link><pubDate>Sat, 19 Oct 2024 09:43:22 +0800</pubDate><guid>https://ual.sg/post/2024/10/19/new-paper-nighttime-street-view-imagery/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Fan Z, Biljecki F (2024): Nighttime Street View Imagery: A new perspective for sensing urban lighting landscape. Sustainable Cities and Society, 116: 105862. &lt;a href="https://doi.org/10.1016/j.scs.2024.105862" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.scs.2024.105862&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-scs-night-svi/2024-scs-night-svi.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/zicheng-fan/"&gt;Zicheng Fan&lt;/a&gt;.
Congratulations on his first PhD journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The dataset has been released openly at &lt;a href="https://github.com/fzc961020/Nighttime-SVI" target="_blank" rel="noopener"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1jy7F7sfVZE4FF" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-12-06.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/19/new-paper-nighttime-street-view-imagery/1_hu_a0558f29e33885bc.webp 400w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/1_hu_dc7c39bbaffc2f77.webp 760w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/1_hu_86cfb9f23c0aa18.webp 1200w"
src="https://ual.sg/post/2024/10/19/new-paper-nighttime-street-view-imagery/1_hu_a0558f29e33885bc.webp"
width="760"
height="308"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/19/new-paper-nighttime-street-view-imagery/2_hu_423397ed8bf8aff5.webp 400w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/2_hu_8ed06beb08cd73bf.webp 760w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/2_hu_87d91834e00149a2.webp 1200w"
src="https://ual.sg/post/2024/10/19/new-paper-nighttime-street-view-imagery/2_hu_423397ed8bf8aff5.webp"
width="760"
height="737"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Street View Imagery (SVI) is taken almost exclusively during daytime, ignoring the urban nightscape.&lt;/li&gt;
&lt;li&gt;Elucidation of data collection and usability of such imagery at night.&lt;/li&gt;
&lt;li&gt;Exploring the profound correspondence between daytime SVI and nighttime SVI.&lt;/li&gt;
&lt;li&gt;Identifying nighttime SVI as a viable complement to Nighttime Lights satellite imagery.&lt;/li&gt;
&lt;li&gt;Positing that nighttime SVI is a latent but valuable urban dataset.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/19/new-paper-nighttime-street-view-imagery/3_hu_28ac94c2bc6e0c82.webp 400w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/3_hu_80c97cb5945c5b8e.webp 760w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/3_hu_b957ccd675f85cc4.webp 1200w"
src="https://ual.sg/post/2024/10/19/new-paper-nighttime-street-view-imagery/3_hu_28ac94c2bc6e0c82.webp"
width="760"
height="436"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/19/new-paper-nighttime-street-view-imagery/4_hu_5df28e8d41f3254d.webp 400w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/4_hu_63624854095afa1c.webp 760w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/4_hu_8aab061fab4fdcbf.webp 1200w"
src="https://ual.sg/post/2024/10/19/new-paper-nighttime-street-view-imagery/4_hu_5df28e8d41f3254d.webp"
width="760"
height="635"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Urban lighting reflects nocturnal activities and it is traditionally observed using Nighttime Lights (NTL) satellite imagery. Few studies systematically measure the nightscape from a human perspective. This study brings a new paradigm — urban lighting sensing via Nighttime Street View Imagery (SVI). The paradigm draws on the accomplishments of (daytime) SVI and gives attention to its ignored nighttime counterpart. We put forward this idea by manually collecting 2,831 nighttime SVIs across various urban functional areas in Singapore. We investigated their values by developing a use case for clustering nighttime lighting patterns. To mitigate the scarcity of nighttime SVI, deep learning regression models were trained to predict nighttime brightness based on corresponding daytime SVIs obtained from widely available sources. The results were compared with brightness data derived from satellite imagery, to affirm the novelty and uniqueness of nighttime SVI. As a result, there are 7 lighting patterns within the collected nighttime SVI, distinct in lighted spot features and total brightness. The identified patterns effectively characterize different urban function scenarios. The best trained brightness prediction model performs well in revealing the city-scale lighting landscape. The SVI-predicted brightness shows a distribution similar to the brightness from satellite imagery and complements it in urban areas with complex vertical lighting structures. This study demonstrates the potential of nighttime SVI as a valuable data source for mapping urban lighting and activities, offering advantages over satellite data. The proposed paradigm contributes significantly to cross-modal information mining in urban studies and has potential applications in scenarios such as light pollution mitigation and crime prevention.&lt;/p&gt;
&lt;/blockquote&gt;
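&lt;p&gt;As a rough illustration of what a scalar brightness target for such prediction models could look like, one simple option is the mean luma of an image. This sketch is our own simplification for illustration, not the paper&amp;rsquo;s method:&lt;/p&gt;

```python
import numpy as np

def mean_brightness(rgb: np.ndarray) -> float:
    """Mean perceptual brightness of an RGB image (channel values in 0-255),
    using the standard Rec. 601 luma weights."""
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return float(luma.mean())

# Toy 1x2 image: one black pixel, one white pixel
img = np.array([[[0, 0, 0], [255, 255, 255]]], dtype=float)
b = mean_brightness(img)  # -> 127.5
```

&lt;p&gt;A brightness score per street-level image, aggregated over a city, is what would then be compared against satellite-derived nighttime brightness.&lt;/p&gt;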
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/19/new-paper-nighttime-street-view-imagery/5_hu_82eecb95f6ca035f.webp 400w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/5_hu_45fea101efe18272.webp 760w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/5_hu_168c1b0958799e67.webp 1200w"
src="https://ual.sg/post/2024/10/19/new-paper-nighttime-street-view-imagery/5_hu_82eecb95f6ca035f.webp"
width="760"
height="695"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/19/new-paper-nighttime-street-view-imagery/6_hu_2b045d9384b159e0.webp 400w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/6_hu_8d4676930b0db527.webp 760w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/6_hu_22943c8304cc46dd.webp 1200w"
src="https://ual.sg/post/2024/10/19/new-paper-nighttime-street-view-imagery/6_hu_2b045d9384b159e0.webp"
width="760"
height="377"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-scs-night-svi/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-scs-night-svi/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/10/19/new-paper-nighttime-street-view-imagery/page-one_hu_63ddc652dfa71555.webp 400w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/page-one_hu_8ca477fd5147b732.webp 760w,
/post/2024/10/19/new-paper-nighttime-street-view-imagery/page-one_hu_78e19700b2096a38.webp 1200w"
src="https://ual.sg/post/2024/10/19/new-paper-nighttime-street-view-imagery/page-one_hu_63ddc652dfa71555.webp"
width="564"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_scs_nightSVI&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Fan, Zicheng and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.scs.2024.105862}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Sustainable Cities and Society}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{105862}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Nighttime Street View Imagery: A new perspective for sensing urban lighting landscape}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{116}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: A panorama-based technique to estimate sky view factor and solar irradiance considering transmittance of tree canopies</title><link>https://ual.sg/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/</link><pubDate>Sun, 15 Sep 2024 19:05:22 +0800</pubDate><guid>https://ual.sg/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Fujiwara K, Ito K, Ignatius M, Biljecki F (2024): A panorama-based technique to estimate sky view factor and solar irradiance considering transmittance of tree canopies. Building and Environment, 266: 112071. &lt;a href="https://doi.org/10.1016/j.buildenv.2024.112071" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.buildenv.2024.112071&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-bae-svf/2024-bae-svf.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/kunihiko-fujiwara/"&gt;Kunihiko Fujiwara&lt;/a&gt;.
Congratulations on this important journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;This paper is another outcome of &lt;a href="https://ual.sg/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/"&gt;our collaboration with Takenaka Corporation&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The code has been released open-source at &lt;a href="https://github.com/kunifujiwara/TreeShadeMapper" target="_blank" rel="noopener"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/1_hu_945ae0f3f4306dff.webp 400w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/1_hu_35934345bbb757b5.webp 760w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/1_hu_3c9bf90b2ee59856.webp 1200w"
src="https://ual.sg/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/1_hu_945ae0f3f4306dff.webp"
width="760"
height="379"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/2_hu_d19e8cffa4b250ed.webp 400w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/2_hu_12b6715c462ade4b.webp 760w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/2_hu_5625aece9222b65b.webp 1200w"
src="https://ual.sg/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/2_hu_d19e8cffa4b250ed.webp"
width="760"
height="323"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Advancing estimation of sky view factor and solar irradiance using panoramic imagery.&lt;/li&gt;
&lt;li&gt;Detecting shading objects based on semantic segmentation of street-level panoramas.&lt;/li&gt;
&lt;li&gt;Estimating transmittance of tree canopies using image binarization.&lt;/li&gt;
&lt;li&gt;Solar irradiance estimated with MAE of 77.8 Wm−2, RMSE of 105.0 Wm−2, and R of 0.90.&lt;/li&gt;
&lt;li&gt;Demonstration of high-resolution mapping and walking route optimization.&lt;/li&gt;
&lt;/ul&gt;
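To make the highlights concrete, here is a toy sketch of a transmittance-aware sky view factor computation. This is not the paper's TCM implementation (the released TreeShadeMapper repository is the authoritative code); it assumes a binarized upper-hemisphere mask derived from an equirectangular panorama, where canopy pixels carry an estimated transmittance in [0, 1] rather than being treated as fully opaque as in the SCM. The function name and the linear row-to-zenith mapping are illustrative assumptions:

```python
import math

def sky_view_factor(sky_mask):
    """Estimate SVF from the upper half of an equirectangular panorama.

    sky_mask[i][j] is 1.0 for open sky, 0.0 for opaque obstructions,
    and an estimated transmittance in (0, 1) for tree canopy pixels.
    Row i maps linearly to zenith angle in [0, 90] degrees.
    """
    rows = len(sky_mask)
    total = 0.0
    weight_sum = 0.0
    for i, row in enumerate(sky_mask):
        # zenith angle at the centre of this pixel row
        theta = (i + 0.5) / rows * (math.pi / 2)
        # projected solid-angle weight of this annulus of directions
        w = math.sin(theta) * math.cos(theta)
        total += w * sum(row) / len(row)
        weight_sum += w
    return total / weight_sum

# A fully open sky yields SVF = 1.0; a uniform canopy with 50%
# transmittance yields SVF = 0.5, whereas a zero-transmission
# assumption (SCM-style) would report 0.0 for the same scene.
open_sky = [[1.0] * 8 for _ in range(4)]
canopy = [[0.5] * 8 for _ in range(4)]
```

The key difference from a solid-canopy computation is only in the mask values: letting canopy pixels contribute their transmittance instead of zero is what allows partial light penetration to be reflected in the SVF.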
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/3_hu_bd053b9891ee10b7.webp 400w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/3_hu_c97ce3dfeda6fcd5.webp 760w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/3_hu_206ae5f885eb83a1.webp 1200w"
src="https://ual.sg/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/3_hu_bd053b9891ee10b7.webp"
width="760"
height="250"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/4_hu_8e65da72c700b7f5.webp 400w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/4_hu_79e5a529c4acbb85.webp 760w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/4_hu_1464fd4ac3f5e47f.webp 1200w"
src="https://ual.sg/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/4_hu_8e65da72c700b7f5.webp"
width="680"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Street-view-based techniques for assessing the sky view factor (SVF) and solar irradiance under trees are gaining attention as tools for evaluating trees as nature-based solutions to mitigate urban heat risks. Although these metrics significantly depend on the morphology of trees and resulting canopy transmittance, an existing approach, termed the Solid Canopy Method (SCM), assumes zero transmission and has not accounted for these variations. This paper advances the computation of both metrics, improving their accuracy and application — we developed the Transmissive Canopy Method (TCM), a panorama-based approach that integrates semantic segmentation and binarization to evaluate SVF and solar irradiance while accounting for transmittance of tree canopies. Using a study area on a university campus in Singapore, we collected data on solar irradiance and 360° imagery to validate our method. The results indicated improved accuracy with MAE, RMSE, and R values of 77.8 Wm−2, 105.0 Wm−2, and 0.90, respectively — significantly outperforming the SCM. We showcased two use cases of our method: (1) high-resolution mapping of SVF and solar irradiance in a field with trees, and (2) walking route optimization considering sunlight exposure. Our findings highlight the strong capability of our TCM to evaluate the effects of trees in mitigating urban heat more accurately than the existing method. Additionally, the TCM has potential applications in urban planning and management, enabling strategic tree planting prioritizing areas lacking sufficient shading and developing tools for optimizing walking routes to minimize sunlight exposure.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/5_hu_82da785cb6c4c8e9.webp 400w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/5_hu_1f4f5d0352ef14c1.webp 760w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/5_hu_e8bd2e2100a814af.webp 1200w"
src="https://ual.sg/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/5_hu_82da785cb6c4c8e9.webp"
width="760"
height="429"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-bae-svf/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-bae-svf/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/page-one_hu_c5fc43bf9dcdafd.webp 400w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/page-one_hu_a5514fd0d0a5341b.webp 760w,
/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/page-one_hu_13f54fa5c7cd39c4.webp 1200w"
src="https://ual.sg/post/2024/09/15/new-paper-a-panorama-based-technique-to-estimate-sky-view-factor-and-solar-irradiance-considering-transmittance-of-tree-canopies/page-one_hu_c5fc43bf9dcdafd.webp"
width="585"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_bae_svf&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Fujiwara, Kunihiko and Ito, Koichi and Ignatius, Marcel and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.buildenv.2024.112071}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Building and Environment}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{112071}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{A panorama-based technique to estimate sky view factor and solar irradiance considering transmittance of tree canopies}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{266}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Translating street view imagery to correct perspectives to enhance bikeability and walkability studies</title><link>https://ual.sg/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/</link><pubDate>Sun, 01 Sep 2024 22:23:22 +0800</pubDate><guid>https://ual.sg/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Ito K, Quintana M, Han X, Zimmermann R, Biljecki F (2024): Translating street view imagery to correct perspectives to enhance bikeability and walkability studies. International Journal of Geographical Information Science, 38(12): 2514-2544. &lt;a href="https://doi.org/10.1080/13658816.2024.2391969" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1080/13658816.2024.2391969&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-ijgis-svi-gan/2024-ijgis-svi-gan.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/koichi-ito/"&gt;Koichi Ito&lt;/a&gt;.
Congratulations on this important journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The work challenges studies that have relied on car-centric street view imagery to assess walkability and bikeability, and it offers a solution.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/1_hu_3146b00a8b53b498.webp 400w,
/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/1_hu_f148982e5e721ec9.webp 760w,
/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/1_hu_d6de4fc6507d5a02.webp 1200w"
src="https://ual.sg/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/1_hu_3146b00a8b53b498.webp"
width="760"
height="430"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/2_hu_a6a8948e6f78e5d4.webp 400w,
/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/2_hu_5a3f03e09008c706.webp 760w,
/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/2_hu_96fcae5e47b127fc.webp 1200w"
src="https://ual.sg/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/2_hu_a6a8948e6f78e5d4.webp"
width="760"
height="436"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Street view imagery (SVI), an emerging geospatial dataset, is useful for evaluating active transportation infrastructure, but it faces potential biases from its vehicle-based capture method, diverging from pedestrians’ and cyclists’ perspectives. Existing literature lacks both an examination of these biases and a solution. This study identifies and quantifies these biases by comparing conventional SVI with views from the road shoulder/sidewalk. To mitigate such perspective biases, we introduce a novel framework with generative adversarial network (GAN)-based image generation models (Pix2Pix and CycleGAN), an image regression model (ResNet-50), and a tabular model (LightGBM). Experiments assessed model effectiveness in translating car-centric views to those from pedestrian and cyclist perspectives. Results show significant differences in semantic indicators (e.g. green view index) between road center and road shoulder/sidewalk SVI, with low Pearson’s correlation coefficients r (0.35–0.55 for road shoulders and 0.45–0.47 for sidewalks) indicating bias. The framework succeeded in creating realistic images and aligning pixel ratios between perspectives, achieving strong correlation coefficients (0.81 for road shoulders and 0.83 for sidewalks), thus reducing bias. This work contributes by providing a scalable and model-agnostic approach to produce accurate SVIs for urban planning and sustainability, setting a foundation for improving bikeability and walkability assessments and promoting active transportation.&lt;/p&gt;
&lt;/blockquote&gt;
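The bias quantification in the abstract rests on two simple quantities: a semantic indicator such as the green view index (GVI) computed from a segmentation output, and Pearson's r between the indicator measured from the two perspectives. A minimal sketch of both, assuming hypothetical label names; this is illustrative, not the paper's code:

```python
import math

def green_view_index(seg_labels, green_classes=frozenset({"tree", "grass", "plant"})):
    """Fraction of pixels labelled as vegetation in a segmentation map.

    seg_labels is a 2-D grid of class labels; green_classes is an
    assumed set of vegetation label names.
    """
    flat = [lab for row in seg_labels for lab in row]
    return sum(lab in green_classes for lab in flat) / len(flat)

def pearson_r(xs, ys):
    """Pearson correlation between indicator series from two perspectives
    (e.g. GVI from the road centre vs. from the sidewalk)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

In the study's terms, a low r between road-centre and sidewalk indicator series signals perspective bias; the GAN-based translation aims to raise that correlation by generating imagery from the pedestrian or cyclist viewpoint before the indicator is computed.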
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-ijgis-svi-gan/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-ijgis-svi-gan/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/page-one_hu_2ea538c237753712.webp 400w,
/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/page-one_hu_26a25f57dbc28732.webp 760w,
/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/page-one_hu_aa1483991ad85367.webp 1200w"
src="https://ual.sg/post/2024/09/01/new-paper-translating-street-view-imagery-to-correct-perspectives-to-enhance-bikeability-and-walkability-studies/page-one_hu_2ea538c237753712.webp"
width="505"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_ijgis_svi_gan&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Ito, Koichi and Quintana, Matias and Han, Xianjing and Zimmermann, Roger and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1080/13658816.2024.2391969}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Geographical Information Science}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{38}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{12}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2514-2544}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Translating street view imagery to correct perspectives to enhance bikeability and walkability studies}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Microclimate vision</title><link>https://ual.sg/post/2024/08/18/new-paper-microclimate-vision/</link><pubDate>Sun, 18 Aug 2024 09:23:22 +0800</pubDate><guid>https://ual.sg/post/2024/08/18/new-paper-microclimate-vision/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Fujiwara K, Khomiakov M, Yap W, Ignatius M, Biljecki F (2024): Microclimate Vision: Multimodal prediction of climatic parameters using street-level and satellite imagery. Sustainable Cities and Society, 114: 105733. &lt;a href="https://doi.org/10.1016/j.scs.2024.105733" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.scs.2024.105733&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-scs-microclimate-vision/2024-scs-microclimate-vision.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/kunihiko-fujiwara/"&gt;Kunihiko Fujiwara&lt;/a&gt;.
Congratulations on this important journal publication! &amp;#x1f64c; &amp;#x1f44f;
We are glad that this paper is one of the first outcomes of &lt;a href="https://ual.sg/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/"&gt;our collaboration with Takenaka Corporation&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The code has been released open-source at &lt;a href="https://github.com/kunifujiwara/microclimate-vision" target="_blank" rel="noopener"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/18/new-paper-microclimate-vision/1_hu_28d1bb03466c5e4b.webp 400w,
/post/2024/08/18/new-paper-microclimate-vision/1_hu_d5c1bdf71071c619.webp 760w,
/post/2024/08/18/new-paper-microclimate-vision/1_hu_eac00dfb7b0fa808.webp 1200w"
src="https://ual.sg/post/2024/08/18/new-paper-microclimate-vision/1_hu_28d1bb03466c5e4b.webp"
width="760"
height="479"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/18/new-paper-microclimate-vision/2_hu_9ac66bb1bb83c4d.webp 400w,
/post/2024/08/18/new-paper-microclimate-vision/2_hu_a0456318b9610cff.webp 760w,
/post/2024/08/18/new-paper-microclimate-vision/2_hu_972a785e5864bde4.webp 1200w"
src="https://ual.sg/post/2024/08/18/new-paper-microclimate-vision/2_hu_9ac66bb1bb83c4d.webp"
width="730"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Prediction of microclimate parameters using street-level and satellite imagery.&lt;/li&gt;
&lt;li&gt;Multimodal prediction model combining LSTM and ResNet-18 architectures.&lt;/li&gt;
&lt;li&gt;Collecting and using microclimate data, street-level imagery, and satellite imagery.&lt;/li&gt;
&lt;li&gt;High accuracy: RMSE at 0.95 °C (T_air), 2.57% (RH), 0.31 m/s (v), 225 W/m2 (GHI).&lt;/li&gt;
&lt;li&gt;Street-level and satellite imagery contribute to improving accuracy.&lt;/li&gt;
&lt;/ul&gt;
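The accuracy figures in the highlights are the standard error metrics MAE and RMSE, computed per parameter over paired predictions and observations. A minimal reference sketch (not code from the paper):

```python
import math

def mae(pred, obs):
    """Mean absolute error between predictions and observations."""
    return sum(abs(p - o) for p, o in zip(pred, obs)) / len(pred)

def rmse(pred, obs):
    """Root mean squared error; penalises large errors more than MAE."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(pred))
```

Each reported value (e.g. 0.95 °C for T_air) is such a metric computed on one target variable's held-out series, so the units follow the variable being predicted.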
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/18/new-paper-microclimate-vision/3_hu_fce532c7e1384bec.webp 400w,
/post/2024/08/18/new-paper-microclimate-vision/3_hu_d152e68dca070c3e.webp 760w,
/post/2024/08/18/new-paper-microclimate-vision/3_hu_e486a0871149a9a9.webp 1200w"
src="https://ual.sg/post/2024/08/18/new-paper-microclimate-vision/3_hu_fce532c7e1384bec.webp"
width="760"
height="544"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/18/new-paper-microclimate-vision/4_hu_c7623c2e42275dd7.webp 400w,
/post/2024/08/18/new-paper-microclimate-vision/4_hu_574823dcaec1793b.webp 760w,
/post/2024/08/18/new-paper-microclimate-vision/4_hu_eb3cba4720aa3c8d.webp 1200w"
src="https://ual.sg/post/2024/08/18/new-paper-microclimate-vision/4_hu_c7623c2e42275dd7.webp"
width="760"
height="750"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;High-resolution microclimate data is essential for capturing spatio-temporal heterogeneity of urban climate and heat health management. However, previous studies have relied on dense measurements that require significant costs for equipment, or on physical simulations demanding intensive computational loads. As a potential alternative to these methods, we propose a multimodal deep learning model to predict microclimate at a high spatial and temporal resolution based on street-level and satellite imagery. This model consists of LSTM and ResNet-18 architectures, and predicts air temperature (T_air), relative humidity (RH), wind speed (v), and global horizontal irradiance (GHI). For our study area situated at a university campus in Singapore, we collected microclimate data, street-level and satellite imagery. We conducted extensive experiments with our collected dataset to showcase our model’s predictive capabilities and its practical use in generating high-resolution microclimate maps. Our model reported RMSE at 0.95 °C for T_air, 2.57% for RH, 0.31 m/s for v, and 225 W/m2 for GHI. Furthermore, we observed a contribution of imagery inputs to higher accuracy by comparing models with and without such inputs. We identified hot spots at a high spatio-temporal resolution, indicating its application for issuing real-time heat alerts. Our models are released openly at the microclimate-vision GitHub repository (&lt;a href="https://github.com/kunifujiwara/microclimate-vision" target="_blank" rel="noopener"&gt;https://github.com/kunifujiwara/microclimate-vision&lt;/a&gt;).&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/18/new-paper-microclimate-vision/5_hu_617126c7551b8769.webp 400w,
/post/2024/08/18/new-paper-microclimate-vision/5_hu_fd2b18699704f5cf.webp 760w,
/post/2024/08/18/new-paper-microclimate-vision/5_hu_f06f0118501fd33c.webp 1200w"
src="https://ual.sg/post/2024/08/18/new-paper-microclimate-vision/5_hu_617126c7551b8769.webp"
width="760"
height="517"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-scs-microclimate-vision/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-scs-microclimate-vision/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/18/new-paper-microclimate-vision/page-one_hu_3ca30324383dd3d7.webp 400w,
/post/2024/08/18/new-paper-microclimate-vision/page-one_hu_afe9da3bf4bfa69f.webp 760w,
/post/2024/08/18/new-paper-microclimate-vision/page-one_hu_6a10b22abcedab15.webp 1200w"
src="https://ual.sg/post/2024/08/18/new-paper-microclimate-vision/page-one_hu_3ca30324383dd3d7.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_scs_microclimate_vision&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Fujiwara, Kunihiko and Khomiakov, Maxim and Yap, Winston and Ignatius, Marcel and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.scs.2024.105733}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Sustainable Cities and Society}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{105733}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Microclimate Vision: Multimodal prediction of climatic parameters using street-level and satellite imagery}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{114}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Nine PhD graduations of our recent visiting scholars</title><link>https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/</link><pubDate>Mon, 12 Aug 2024 15:02:48 +0800</pubDate><guid>https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/</guid><description>&lt;p&gt;In the past few years we hosted several visiting scholars from foreign universities at our research group in Singapore, resulting in many awesome collaborations and projects.&lt;/p&gt;
&lt;p&gt;Recently, nine of them completed their PhDs, obtaining doctoral degrees from universities in mainland China, the USA, Denmark, the Netherlands, France, and Hong Kong. 🇨🇳🇺🇸🇩🇰🇳🇱🇫🇷🇭🇰&lt;/p&gt;
&lt;p&gt;They are featured below.&lt;/p&gt;
&lt;p&gt;Congratulations!
It has been a pleasure to host them in our research group and to follow their successes and subsequent work.
We are very proud of their accomplishments, thank them for their collaboration (which we hope to continue), and wish them all the best as their careers progress.&lt;/p&gt;
&lt;p&gt;We continuously welcome visiting scholars.
If you are considering such a visit, please read our &lt;a href="https://ual.sg/opportunities/application-guide/"&gt;Guide for prospective applicants&lt;/a&gt;.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="tianhong-zhao-shenzhen-university"&gt;Tianhong Zhao, Shenzhen University&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/tianhong-phd_hu_60826d280f251245.webp 400w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/tianhong-phd_hu_28ff90a50e33ce06.webp 760w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/tianhong-phd_hu_c5e8662a6f1dc4f4.webp 1200w"
src="https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/tianhong-phd_hu_60826d280f251245.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/tianhong-zhao/"&gt;Tianhong Zhao&lt;/a&gt; obtained his PhD in Urban Informatics at Shenzhen University on &lt;em&gt;Research on data-driven approaches for modeling urban public transit demand&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;During his stay in our Lab, we collaborated on a variety of topics, for example, on &lt;a href="https://ual.sg/publication/2023-ceus-soundscapes/"&gt;Sensing urban soundscapes from street view imagery&lt;/a&gt;, which was published in CEUS. Tianhong&amp;rsquo;s publications are available &lt;a href="https://scholar.google.com/citations?user=zKBGvToAAAAJ&amp;amp;hl=zh-CN" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Tianhong is now a Lecturer at Shenzhen Technology University.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="xiaofan-liang-georgia-institute-of-technology"&gt;Xiaofan Liang, Georgia Institute of Technology&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/xiaofan-phd_hu_9d8929113ce57c3c.webp 400w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/xiaofan-phd_hu_815e42aa8449de7e.webp 760w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/xiaofan-phd_hu_a883ad2d6a023163.webp 1200w"
src="https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/xiaofan-phd_hu_9d8929113ce57c3c.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/xiaofan-liang/"&gt;Xiaofan Liang&lt;/a&gt; defended her thesis &lt;em&gt;&lt;a href="https://repository.gatech.edu/entities/publication/17423254-589d-4fc3-84a3-e66b05eab76e" target="_blank" rel="noopener"&gt;Connectivity for whom and at what cost: contesting network infrastructure duality in urban planning&lt;/a&gt;&lt;/em&gt; and was awarded a PhD in City and Regional Planning at the Georgia Institute of Technology.
The research was conducted in our sister lab, the &lt;a href="http://friendlycities.gatech.edu" target="_blank" rel="noopener"&gt;Friendly Cities Lab&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;During her stay in our research group, we worked on &lt;a href="https://www.xiaofanliang.com/project/sipoi/" target="_blank" rel="noopener"&gt;analysing the social infrastructure near subway stations&lt;/a&gt;, at the global scale and using OpenStreetMap.
Read more about Xiaofan&amp;rsquo;s work on &lt;a href="https://www.xiaofanliang.com" target="_blank" rel="noopener"&gt;her personal website&lt;/a&gt;, while her publications are &lt;a href="https://scholar.google.com/citations?user=fMkIGgMAAAAJ&amp;amp;hl=en&amp;amp;oi=ao" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Xiaofan is now an Assistant Professor of Urban and Regional Planning at Taubman College of Architecture &amp;amp; Urban Planning, University of Michigan - Ann Arbor.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="benjamin-beaucamp-centrale-nantes"&gt;Benjamin Beaucamp, Centrale Nantes&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/benjamin-phd_hu_b1d9a7cacd207099.webp 400w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/benjamin-phd_hu_e87608490f300c66.webp 760w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/benjamin-phd_hu_5644b35145efdda5.webp 1200w"
src="https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/benjamin-phd_hu_b1d9a7cacd207099.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://fr.linkedin.com/in/benjamin-beaucamp/en" target="_blank" rel="noopener"&gt;Benjamin Beaucamp&lt;/a&gt; defended his PhD thesis &lt;em&gt;&lt;a href="https://www.ec-nantes.fr/medias/fichier/resume_1717069935150-pdf" target="_blank" rel="noopener"&gt;Automatic assessment of the visual perception of the urban space by pedestrians&lt;/a&gt;&lt;/em&gt; at Centrale Nantes.
He conducted this research in the &lt;a href="https://aau.archi.fr/crenau/" target="_blank" rel="noopener"&gt;Urban Architecture Nantes Research Centre (CRENAU)&lt;/a&gt;, part of the &lt;a href="https://aau.archi.fr" target="_blank" rel="noopener"&gt;Architectural and Urban Ambiances Laboratory (AAU)&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;While he was in our Lab, we worked on understanding the perspective bias in street view imagery, and we are continuing our collaboration, which also includes the preparation of a new kind of SVI dataset.&lt;/p&gt;
&lt;p&gt;Benjamin is now a Postdoc in the &lt;a href="https://geoloc.univ-gustave-eiffel.fr" target="_blank" rel="noopener"&gt;Geoloc lab&lt;/a&gt; at the Université Gustave Eiffel, which is known for its urban planning education and research.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="cai-wu-university-of-twente"&gt;Cai Wu, University of Twente&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/cai-phd_hu_88061b8c66d7a5ae.webp 400w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/cai-phd_hu_3668f3384b8806e7.webp 760w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/cai-phd_hu_15bf447ee27d6d12.webp 1200w"
src="https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/cai-phd_hu_88061b8c66d7a5ae.webp"
width="760"
height="529"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/cai-wu/"&gt;Cai Wu&lt;/a&gt; just &lt;a href="https://www.utwente.nl/en/education/tgs/currentcandidates/phd/calendar/2024/6/1563394/phd-defence-cai-wu-a-spatial-driven-urban-pattern-language-framework-for-design-and-planning" target="_blank" rel="noopener"&gt;received his PhD&lt;/a&gt; from the University of Twente, defending his thesis &lt;em&gt;&lt;a href="https://research.utwente.nl/en/publications/a-spatial-data-driven-urban-pattern-language-framework-for-design" target="_blank" rel="noopener"&gt;A spatial data-driven urban pattern language framework for design and planning&lt;/a&gt;&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;During his stay in our research group, we worked on a variety of topics related to walkability and urban morphology.
The full list of his publications is available &lt;a href="https://scholar.google.com/citations?user=jdnF-JYAAAAJ&amp;amp;hl=en" target="_blank" rel="noopener"&gt;here&lt;/a&gt;, while more information about his research is available on &lt;a href="https://wucai.me" target="_blank" rel="noopener"&gt;his personal website&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Cai is soon joining The Hong Kong University of Science and Technology (Guangzhou) as an Assistant Professor.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="yan-zhang-wuhan-university"&gt;Yan Zhang, Wuhan University&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/yan-phd_hu_708640ae9bfa4822.webp 400w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/yan-phd_hu_f5ed0fc10e5e4501.webp 760w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/yan-phd_hu_f9819c74107c9a76.webp 1200w"
src="https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/yan-phd_hu_708640ae9bfa4822.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/yan-zhang/"&gt;Yan Zhang&lt;/a&gt; defended his PhD thesis &lt;em&gt;A multi-scale spatio-temporal sensing method for urban function zone based on street view images&lt;/em&gt; at Wuhan University.&lt;/p&gt;
&lt;p&gt;During his stay in our group, we worked on advancing urban analytics using street view imagery.
One of the efforts was published as a paper in the ISPRS Journal of Photogrammetry and Remote Sensing &amp;ndash; &lt;em&gt;&lt;a href="https://ual.sg/publication/2023-ijprs-knowledge-topology/"&gt;Knowledge and topology: A two layer spatially dependent graph neural networks to identify urban functions with time-series street view image&lt;/a&gt;&lt;/em&gt;.
Yan&amp;rsquo;s full list of publications is available &lt;a href="https://scholar.google.com.hk/citations?user=H8T2HtsAAAAJ" target="_blank" rel="noopener"&gt;here&lt;/a&gt;, while more information about his research is available on &lt;a href="https://sites.google.com/view/giserzhang" target="_blank" rel="noopener"&gt;his personal website&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Yan is now a Postdoc at The Chinese University of Hong Kong.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="maxim-khomiakov-technical-university-of-denmark"&gt;Maxim Khomiakov, Technical University of Denmark&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/maxim-phd_hu_67cff9e6bdca455c.webp 400w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/maxim-phd_hu_58166b7bc7cb3855.webp 760w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/maxim-phd_hu_29edda2a409e952d.webp 1200w"
src="https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/maxim-phd_hu_67cff9e6bdca455c.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/maxim-khomiakov/"&gt;Maxim Khomiakov&lt;/a&gt; defended his PhD thesis at the Technical University of Denmark, bringing advancements in the use of deep learning in remote sensing.&lt;/p&gt;
&lt;p&gt;While with us, Maxim was an important contributor to our &lt;a href="https://ual.sg/project/global-streetscapes"&gt;Global Streetscapes project&lt;/a&gt;, which greatly benefited from his expertise and ideas.&lt;/p&gt;
&lt;p&gt;His publications are available &lt;a href="https://scholar.google.com.sg/citations?user=czbfDcwAAAAJ&amp;amp;hl=en&amp;amp;oi=ao" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.
He also has &lt;a href="https://www.maxims.dev" target="_blank" rel="noopener"&gt;a personal website&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Maxim is now working as an ML Research Engineer at &lt;a href="http://www.pihalf.com/" target="_blank" rel="noopener"&gt;pihalf&lt;/a&gt; and he is also a Research Affiliate at the Technical University of Denmark.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="junjie-luo-tianjin-university"&gt;Junjie Luo, Tianjin University&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/junjie-phd_hu_aacf6cafa58b81b9.webp 400w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/junjie-phd_hu_95827d51400d09c0.webp 760w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/junjie-phd_hu_7f829aa3212fb466.webp 1200w"
src="https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/junjie-phd_hu_aacf6cafa58b81b9.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/junjie-luo/"&gt;Junjie Luo&lt;/a&gt; defended his PhD thesis &lt;em&gt;Visual Perception Analysis of Urban Riverscapes Based on a Digital Twin System&lt;/em&gt; at Tianjin University.&lt;/p&gt;
&lt;p&gt;Our research group learned a lot from Junjie&amp;rsquo;s expertise during his research visit.
We worked predominantly on human perception studies based on novel datasets.
Several of these efforts have been published in leading journals.
For example, Landscape and Urban Planning published his work &lt;a href="https://ual.sg/publication/2022-land-semantic-riverscapes/"&gt;Semantic Riverscapes: Perception and evaluation of linear landscapes from oblique imagery using computer vision&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The full list of his publications is available &lt;a href="https://scholar.google.com/citations?hl=en&amp;amp;user=9DZiTEUAAAAJ" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Junjie is now an Associate Professor at Zhejiang A&amp;amp;F University.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="rui-ma-city-university-of-hong-kong"&gt;Rui Ma, City University of Hong Kong&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/rui-phd_hu_f39a127107f4475e.webp 400w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/rui-phd_hu_95b69928c9280fb9.webp 760w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/rui-phd_hu_feabdad57cacaca3.webp 1200w"
src="https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/rui-phd_hu_f39a127107f4475e.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/rui-ma/"&gt;Rui Ma&lt;/a&gt; was awarded a PhD in Architecture and Civil Engineering from the City University of Hong Kong for his thesis on urban building energy modeling.&lt;/p&gt;
&lt;p&gt;During his stay in our Lab, we worked on multiple research efforts, including advancing 3D building reconstruction using street view imagery.
The full list of his publications is available &lt;a href="https://scholar.google.com/citations?user=V_KsQSgAAAAJ&amp;amp;hl=zh-CN" target="_blank" rel="noopener"&gt;here&lt;/a&gt;, while more information about his research is available on &lt;a href="https://ruirzma.github.io" target="_blank" rel="noopener"&gt;his personal website&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Rui is currently a Postdoc at the same university.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="dayu-yu-wuhan-university"&gt;Dayu Yu, Wuhan University&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/dayu-phd_hu_97e83f9c3c5e1c27.webp 400w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/dayu-phd_hu_e25820ed840aa334.webp 760w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/dayu-phd_hu_d6040bf0e84cae8.webp 1200w"
src="https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/dayu-phd_hu_97e83f9c3c5e1c27.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/dayu-yu/"&gt;Dayu Yu&lt;/a&gt; has obtained his PhD from Wuhan University, defending his thesis &lt;em&gt;Lightweight Generation and Service Methods for Photo-Realistic 3D Building Models&lt;/em&gt;, which he conducted at the School of Remote Sensing Information Engineering.&lt;/p&gt;
&lt;p&gt;We collaborated with Dayu on an effort to advance 3D spatial data infrastructures.
You can read more about his research on &lt;a href="https://dayuyu-3d.github.io" target="_blank" rel="noopener"&gt;his personal website&lt;/a&gt;, while the list of his publications is available &lt;a href="https://scholar.google.com.hk/citations?hl=zh-CN&amp;amp;user=C3Pkj1sAAAAJ" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Dayu is now an Assistant Professor at Nanjing Normal University.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;In addition, we are pleased to take this opportunity to feature &lt;a href="https://ual.sg/author/chenyi-cai/"&gt;Chenyi Cai&lt;/a&gt;, a Postdoctoral Research Fellow at Future Cities Lab Global (Singapore-ETH Centre) in our project &lt;a href="https://fcl.ethz.ch/research/integration-and-strategies/semantic-urban-elements.html" target="_blank" rel="noopener"&gt;Semantic Urban Elements&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="chenyi-cai-southeast-university"&gt;Chenyi Cai, Southeast University&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/chenyi-phd_hu_8be09872e247e662.webp 400w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/chenyi-phd_hu_de0bd5f59ccf7836.webp 760w,
/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/chenyi-phd_hu_a92357f7f868f52e.webp 1200w"
src="https://ual.sg/post/2024/08/12/nine-phd-graduations-of-our-recent-visiting-scholars/chenyi-phd_hu_8be09872e247e662.webp"
width="760"
height="427"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ch.linkedin.com/in/chenyi-cai-680943201" target="_blank" rel="noopener"&gt;Chenyi Cai&lt;/a&gt; has obtained her PhD at the Southeast University in Nanjing.
During her PhD she was also a visiting researcher at the Institute of Technology in Architecture, ETH Zurich.&lt;/p&gt;
&lt;p&gt;Chenyi&amp;rsquo;s publications are available &lt;a href="https://scholar.google.com/citations?hl=en&amp;amp;user=DV1nKooAAAAJ&amp;amp;view_op=list_works&amp;amp;sortby=pubdate" target="_blank" rel="noopener"&gt;here&lt;/a&gt;, while her profile is available on the &lt;a href="https://fcl.ethz.ch/people/researchers/cai-chenyi.html" target="_blank" rel="noopener"&gt;SEC website&lt;/a&gt;.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;Congratulations to everyone, well done! 🎉👏&lt;/p&gt;</description></item><item><title>Happy 20th birthday OpenStreetMap!</title><link>https://ual.sg/post/2024/08/11/happy-20th-birthday-openstreetmap/</link><pubDate>Sun, 11 Aug 2024 16:33:03 +0800</pubDate><guid>https://ual.sg/post/2024/08/11/happy-20th-birthday-openstreetmap/</guid><description>&lt;p&gt;
&lt;figure id="figure-singapore-in-openstreetmap-map-as-of-2024-08-11-c-openstreetmap-contributors"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Singapore in OpenStreetMap (map as of 2024-08-11). (c) OpenStreetMap contributors." srcset="
/post/2024/08/11/happy-20th-birthday-openstreetmap/osm-sg_hu_37962969cd395961.webp 400w,
/post/2024/08/11/happy-20th-birthday-openstreetmap/osm-sg_hu_620ec52ae72a1f08.webp 760w,
/post/2024/08/11/happy-20th-birthday-openstreetmap/osm-sg_hu_eabfde4566f0317b.webp 1200w"
src="https://ual.sg/post/2024/08/11/happy-20th-birthday-openstreetmap/osm-sg_hu_37962969cd395961.webp"
width="760"
height="462"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Singapore in OpenStreetMap (map as of 2024-08-11). (c) OpenStreetMap contributors.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.openstreetmap.org/" target="_blank" rel="noopener"&gt;OpenStreetMap (OSM)&lt;/a&gt; is turning 20!&lt;/p&gt;
&lt;p&gt;For those who are not familiar, Wikipedia &lt;a href="https://en.wikipedia.org/wiki/OpenStreetMap" target="_blank" rel="noopener"&gt;defines&lt;/a&gt; it well:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;OpenStreetMap (OSM) is a free, open geographic database updated and maintained by a community of volunteers via open collaboration.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Our group and discipline have greatly benefited from this amazing project and its contributors. Over the past two decades, OpenStreetMap has been a game changer, enabling communities worldwide, supporting research and products, and spurring a vibrant research ecosystem.&lt;/p&gt;
&lt;p&gt;Thanks to OSM, we have access to a variety of geospatial information from around the world previously not freely available, and we are especially proud that Singapore is so well mapped in the platform. 🇸🇬🗺️&lt;/p&gt;
&lt;p&gt;While OSM data is regularly part of our projects, we have also been conducting research on its data, e.g. data quality:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Chow YS, Lee K (2023): Quality of crowdsourced geospatial building information: A global assessment of OpenStreetMap attributes. &lt;em&gt;Building and Environment&lt;/em&gt; 237: 110295. &lt;a href="https://doi.org/10.1016/j.buildenv.2023.110295" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.buildenv.2023.110295&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-bae-osm-qa/2023-bae-osm-qa.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;We look forward to seeing where the next 20 years will take the project!&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-our-very-well-mapped-nus-kent-ridge-campus-in-openstreetmap-map-as-of-2024-08-11-c-openstreetmap-contributors"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Our very well mapped NUS Kent Ridge campus in OpenStreetMap (map as of 2024-08-11). (c) OpenStreetMap contributors." srcset="
/post/2024/08/11/happy-20th-birthday-openstreetmap/osm-nus_hu_f8654b932c12cc6b.webp 400w,
/post/2024/08/11/happy-20th-birthday-openstreetmap/osm-nus_hu_ea1d07d7bf08830.webp 760w,
/post/2024/08/11/happy-20th-birthday-openstreetmap/osm-nus_hu_7965cebf2767fd2.webp 1200w"
src="https://ual.sg/post/2024/08/11/happy-20th-birthday-openstreetmap/osm-nus_hu_f8654b932c12cc6b.webp"
width="760"
height="462"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Our very well mapped NUS Kent Ridge campus in OpenStreetMap (map as of 2024-08-11). (c) OpenStreetMap contributors.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper: High-resolution mapping of urban Aedes aegypti immature abundance through breeding site detection based on satellite and street view imagery</title><link>https://ual.sg/post/2024/08/07/new-paper-high-resolution-mapping-of-urban-aedes-aegypti-immature-abundance-through-breeding-site-detection-based-on-satellite-and-street-view-imagery/</link><pubDate>Wed, 07 Aug 2024 16:01:09 +0800</pubDate><guid>https://ual.sg/post/2024/08/07/new-paper-high-resolution-mapping-of-urban-aedes-aegypti-immature-abundance-through-breeding-site-detection-based-on-satellite-and-street-view-imagery/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Knoblauch S, Su Yin M, Chatrinan K, de Aragão Rocha AA, Haddawy P, Biljecki F, Lautenbach S, Resch B, Arifi D, Jänisch T, Morales I, Zipf A (2024): High-resolution mapping of urban Aedes aegypti immature abundance through breeding site detection based on satellite and street view imagery. Scientific Reports 14(1): 18227. &lt;a href="https://doi.org/10.1038/s41598-024-67914-w" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1038/s41598-024-67914-w&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-sr-aa-mapping/2024-sr-aa-mapping.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://www.geog.uni-heidelberg.de/gis/knoblauch.html" target="_blank" rel="noopener"&gt;Steffen Knoblauch&lt;/a&gt; from the &lt;a href="https://www.geog.uni-heidelberg.de/gis/index_en.html" target="_blank" rel="noopener"&gt;GIScience Research Group&lt;/a&gt; at Heidelberg University in Germany.
Congratulations on the publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://doi.org/10.1038/s41598-024-67914-w" target="_blank" rel="noopener"&gt;available open access&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/07/new-paper-high-resolution-mapping-of-urban-aedes-aegypti-immature-abundance-through-breeding-site-detection-based-on-satellite-and-street-view-imagery/1_hu_28478617dedf1323.webp 400w,
/post/2024/08/07/new-paper-high-resolution-mapping-of-urban-aedes-aegypti-immature-abundance-through-breeding-site-detection-based-on-satellite-and-street-view-imagery/1_hu_61534999b8df7088.webp 760w,
/post/2024/08/07/new-paper-high-resolution-mapping-of-urban-aedes-aegypti-immature-abundance-through-breeding-site-detection-based-on-satellite-and-street-view-imagery/1_hu_2ac81d3bfffea0cf.webp 1200w"
src="https://ual.sg/post/2024/08/07/new-paper-high-resolution-mapping-of-urban-aedes-aegypti-immature-abundance-through-breeding-site-detection-based-on-satellite-and-street-view-imagery/1_hu_28478617dedf1323.webp"
width="760"
height="671"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Identification of Aedes aegypti breeding hotspots is essential for the implementation of targeted vector control strategies and thus the prevention of several mosquito-borne diseases worldwide. Training computer vision models on satellite and street view imagery in the municipality of Rio de Janeiro, we analyzed the correlation between the density of common breeding grounds and Aedes aegypti infestation measured by ovitraps on a monthly basis between 2019 and 2022. Our findings emphasized the significance (p ≤ 0.05) of micro-habitat proxies generated through object detection, allowing to explain high spatial variance in urban abundance of Aedes aegypti immatures. Water tanks, non-mounted car tires, plastic bags, potted plants, and storm drains positively correlated with Aedes aegypti egg and larva counts considering a 1000 m mosquito flight range buffer around 2700 ovitrap locations, while dumpsters, small trash bins, and large trash bins exhibited a negative association. This complementary application of satellite and street view imagery opens the pathway for high-resolution interpolation of entomological surveillance data and has the potential to optimize vector control strategies. Consequently it supports the mitigation of emerging infectious diseases transmitted by Aedes aegypti, such as dengue, chikungunya, and Zika, which cause thousands of deaths each year.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-sr-aa-mapping/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-sr-aa-mapping/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/07/new-paper-high-resolution-mapping-of-urban-aedes-aegypti-immature-abundance-through-breeding-site-detection-based-on-satellite-and-street-view-imagery/page-one_hu_239a197693053b77.webp 400w,
/post/2024/08/07/new-paper-high-resolution-mapping-of-urban-aedes-aegypti-immature-abundance-through-breeding-site-detection-based-on-satellite-and-street-view-imagery/page-one_hu_17a9befc3027d50c.webp 760w,
/post/2024/08/07/new-paper-high-resolution-mapping-of-urban-aedes-aegypti-immature-abundance-through-breeding-site-detection-based-on-satellite-and-street-view-imagery/page-one_hu_cd201b8321c6dc4c.webp 1200w"
src="https://ual.sg/post/2024/08/07/new-paper-high-resolution-mapping-of-urban-aedes-aegypti-immature-abundance-through-breeding-site-detection-based-on-satellite-and-street-view-imagery/page-one_hu_239a197693053b77.webp"
width="576"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_sr_aa_mapping&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Knoblauch, Steffen and Su Yin, Myat and Chatrinan, Krittin and de Arag{\~a}o Rocha, Antonio Augusto and Haddawy, Peter and Biljecki, Filip and Lautenbach, Sven and Resch, Bernd and Arifi, Dorian and J{\&amp;#34;a}nisch, Thomas and Morales, Ivonne and Zipf, Alexander}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1038/s41598-024-67914-w}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Scientific Reports}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{High-resolution mapping of urban Aedes aegypti immature abundance through breeding site detection based on satellite and street view imagery}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{14}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{18227}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Evaluating human perception of building exteriors using street view imagery</title><link>https://ual.sg/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/</link><pubDate>Tue, 06 Aug 2024 09:13:22 +0800</pubDate><guid>https://ual.sg/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Liang X, Chang JH, Gao S, Zhao T, Biljecki F (2024): Evaluating human perception of building exteriors using street view imagery. Building and Environment, 263: 111875. &lt;a href="https://doi.org/10.1016/j.buildenv.2024.111875" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.buildenv.2024.111875&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-bae-building/2024-bae-building.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/xiucheng-liang/"&gt;Xiucheng Liang&lt;/a&gt;.
Congratulations on this important journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/1_hu_4f3d51c0cce31c0d.webp 400w,
/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/1_hu_e092f4b09d271eb4.webp 760w,
/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/1_hu_46f8b1e535c1b837.webp 1200w"
src="https://ual.sg/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/1_hu_4f3d51c0cce31c0d.webp"
width="760"
height="363"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1jXKy1HudNFfq2" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-09-21.&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;A deep learning approach is developed to evaluate building exteriors in three cities.&lt;/li&gt;
&lt;li&gt;Spatial patterns of architectural design are identified from street view imagery.&lt;/li&gt;
&lt;li&gt;Building characteristics in cities are analysed using objective building attributes.&lt;/li&gt;
&lt;li&gt;The influence of building perceptions on overall streetscape perceptions is quantified.&lt;/li&gt;
&lt;li&gt;The historical and complex ambience of buildings enhances streetscape perception quality.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/2_hu_cf5d10e3a5419e6.webp 400w,
/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/2_hu_4ad0185fb640b723.webp 760w,
/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/2_hu_fe391ad5201a871d.webp 1200w"
src="https://ual.sg/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/2_hu_cf5d10e3a5419e6.webp"
width="760"
height="375"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Building appearances profoundly shape the urban visual landscape, influencing city images and the quality of urban life. Traditional methods for evaluating the perceptual and aesthetic qualities of building facades are often limited in scope. Despite recent studies that have sought to understand human perception of urban streetscapes, our grasp of how individuals perceive building exteriors on a broader scale and the subsequent impact on holistic street experiences, remains largely unexplored. In this study, we integrate a traditional survey-based evaluation framework with machine learning techniques to analyse human perception of over 250,000 building images from Singapore, San Francisco, and Amsterdam. Specifically, deep learning models trained on crowdsourced ratings of 1,200 building images across six perceptual attributes — complex, original, ordered, pleasing, boring, and style — achieve over 72% accuracy. This novel approach enables adaptive and comparative analyses of building appearances across regions, revealing spatial patterns in the perception of architectural exteriors and their relationships with functions, age, and location. Moreover, by applying propensity score matching to match images based on their features, we mark one of the first efforts to investigate the perceptual impacts of buildings on streetscape perceptions. The results show that streetscapes with higher levels of complex, pleasing, and historical ambience from buildings elicit more positive perceptions, whereas modern and monotonous exteriors often evoke holistic feelings of being “boring” and “depressing”. These findings offer architects and city planners valuable insights into public sentiment towards city-level building exteriors and their influence on urban identity and perception.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-bae-building/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-bae-building/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/page-one_hu_87a8eb9f5c59ee57.webp 400w,
/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/page-one_hu_5601a4f04f27957.webp 760w,
/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/page-one_hu_9ad7460d5b861c6b.webp 1200w"
src="https://ual.sg/post/2024/08/06/new-paper-evaluating-human-perception-of-building-exteriors-using-street-view-imagery/page-one_hu_87a8eb9f5c59ee57.webp"
width="561"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_bae_building&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Liang, Xiucheng and Chang, Jiat Hwee and Gao, Song and Zhao, Tianhong and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.buildenv.2024.111875}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Building and Environment}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{111875}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Evaluating human perception of building exteriors using street view imagery}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{263}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Introducing Global Streetscapes</title><link>https://ual.sg/post/2024/07/18/introducing-global-streetscapes/</link><pubDate>Thu, 18 Jul 2024 12:01:37 +0800</pubDate><guid>https://ual.sg/post/2024/07/18/introducing-global-streetscapes/</guid><description>&lt;p&gt;We are excited to announce our project &lt;a href="https://ual.sg/project/global-streetscapes/"&gt;&lt;em&gt;Global Streetscapes&lt;/em&gt;&lt;/a&gt;!&lt;/p&gt;
&lt;p&gt;It is a large, open, labelled street-level imagery dataset addressing various challenges in using street view imagery in the urban sciences.
A comprehensive paper about the project is published as a namesake &lt;a href="https://doi.org/10.1016/j.isprsjprs.2024.06.023" target="_blank" rel="noopener"&gt;article&lt;/a&gt; in the &lt;em&gt;ISPRS Journal of Photogrammetry and Remote Sensing&lt;/em&gt;:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Hou Y, Quintana M, Khomiakov M, Yap W, Ouyang J, Ito K, Wang Z, Zhao T, Biljecki F (2024): Global Streetscapes &amp;ndash; A comprehensive dataset of 10 million street-level images across 688 cities for urban science and analytics. &lt;em&gt;ISPRS Journal of Photogrammetry and Remote Sensing&lt;/em&gt; 215: 216-238.
&lt;a href="https://doi.org/10.1016/j.isprsjprs.2024.06.023" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt;10.1016/j.isprsjprs.2024.06.023&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-global-streetscapes/2024-global-streetscapes.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The project was led by &lt;a href="https://ual.sg/author/yujun-hou/"&gt;Yujun Hou&lt;/a&gt;, and it was carried out in a large collaboration within our research group.&lt;/p&gt;
&lt;p&gt;Global Streetscapes is a worldwide dataset of 10 million crowdsourced SVIs sampled from Mapillary and KartaView. It covers 688 cities around the world, which are home to about 10% of the world&amp;rsquo;s population, and it is enriched with more than 300 attributes, giving it wide geographical, environmental, and temporal diversity.
In addition, the project is supported with open-source code and documentation.&lt;/p&gt;
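&lt;p&gt;To illustrate how such attribute-enriched metadata might be used, here is a minimal Python sketch that filters a toy table down to usable images. The column names are illustrative only, not necessarily the dataset&amp;rsquo;s actual schema; see the &lt;a href="https://github.com/ualsg/global-streetscapes" target="_blank" rel="noopener"&gt;repository&lt;/a&gt; for the real files and documentation:&lt;/p&gt;
&lt;pre&gt;&lt;code class="language-python"&gt;import pandas as pd

# Toy rows mimicking a few of the 300+ attributes described in the paper;
# the column names here are illustrative, not the dataset's actual schema.
df = pd.DataFrame({
    "uuid": ["a", "b", "c"],
    "city": ["Singapore", "Amsterdam", "Singapore"],
    "lighting_condition": ["day", "night", "day"],
    "quality": ["good", "good", "poor"],
})

# Keep only daytime images of good quality, as one might before an analysis.
usable = df[df["lighting_condition"] == "day"]
usable = usable[usable["quality"] == "good"]
print(usable["uuid"].tolist())  # ['a']
&lt;/code&gt;&lt;/pre&gt;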
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/18/introducing-global-streetscapes/coverage_hu_bf45eef4b94288c0.webp 400w,
/post/2024/07/18/introducing-global-streetscapes/coverage_hu_d06690ab99b9dd88.webp 760w,
/post/2024/07/18/introducing-global-streetscapes/coverage_hu_44a8f374b00a0a85.webp 1200w"
src="https://ual.sg/post/2024/07/18/introducing-global-streetscapes/coverage_hu_bf45eef4b94288c0.webp"
width="760"
height="342"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/18/introducing-global-streetscapes/example_hu_e67d66a41c34183a.webp 400w,
/post/2024/07/18/introducing-global-streetscapes/example_hu_d505ce0e8766580b.webp 760w,
/post/2024/07/18/introducing-global-streetscapes/example_hu_54b77dcad5869c80.webp 1200w"
src="https://ual.sg/post/2024/07/18/introducing-global-streetscapes/example_hu_e67d66a41c34183a.webp"
width="760"
height="664"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Check the &lt;a href="https://ual.sg/project/global-streetscapes/"&gt;website of the project&lt;/a&gt; for more information and links to the dataset, code, and other products, while the &lt;a href="https://doi.org/10.1016/j.isprsjprs.2024.06.023" target="_blank" rel="noopener"&gt;paper&lt;/a&gt; details the motivation, methodology, examples of results, and provides use cases.&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1jRHE3I9x1qnmq" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-09-04.&lt;/p&gt;
&lt;h2 id="highlights"&gt;Highlights&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Largest labelled dataset, with 346 attributes that characterise street photos.&lt;/li&gt;
&lt;li&gt;Baseline models and ground truth labels for benchmarking computer vision models.&lt;/li&gt;
&lt;li&gt;Reproducible framework to sample and enrich SVIs from cities all around the world.&lt;/li&gt;
&lt;li&gt;In-depth discussion of how the dataset could drive novel research questions.&lt;/li&gt;
&lt;li&gt;Taking forward the work of Mapillary and KartaView, and their contributors.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="abstract"&gt;Abstract&lt;/h2&gt;
&lt;blockquote&gt;
&lt;p&gt;Street view imagery (SVI) is instrumental for sensing urban environments, benefitting numerous domains such as urban morphology, health, greenery, and accessibility. Billions of images worldwide have been made available by commercial services such as Google Street View and crowdsourcing services such as Mapillary and KartaView where anyone from anywhere can upload imagery while moving. However, while the data tend to be plentiful, have high coverage and quality, and are used to derive rich insights, they remain simple and limited in metadata as characteristics such as weather, quality, and lighting conditions remain unknown, making it difficult to evaluate the suitability of the images for specific analyses. We introduce Global Streetscapes — a dataset of 10 million crowdsourced and free-to-use SVIs sampled from 688 cities across 210 countries and territories, enriched with more than 300 camera, geographical, temporal, contextual, semantic, and perceptual attributes. The cities included are well balanced and diverse, and are home to about 10% of the world’s population. Deep learning models are trained on a subset of manually labelled images for eight visual-contextual attributes pertaining to the usability of SVI — panoramic status, lighting condition, view direction, weather, platform, quality, presence of glare and reflections, achieving accuracy ranging from 68.3% to 99.9%, and used to automatically label the entire dataset. Thanks to its scale and pre-computed standard semantic information, the data can be readily used to benefit existing use cases and to unlock new applications, including multi-city comparative studies and longitudinal analyses, as affirmed by a couple of use cases in the paper. Moreover, the automated processes and open-source code facilitate the expansion and updates of the dataset and encourage users to create their own datasets. 
With the rich manual annotations, some of which are provided for the first time, and diverse conditions present in the images, the dataset also facilitates assessing the heterogeneous properties of crowdsourced SVIs and provides a benchmark for evaluating future computer vision models. We make the Global Streetscapes dataset and the code to reproduce and use it publicly available in &lt;a href="https://github.com/ualsg/global-streetscapes" target="_blank" rel="noopener"&gt;https://github.com/ualsg/global-streetscapes&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-global-streetscapes/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-global-streetscapes/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/18/introducing-global-streetscapes/page-one_hu_15f889437c5d56b3.webp 400w,
/post/2024/07/18/introducing-global-streetscapes/page-one_hu_f99ccdf70cae5053.webp 760w,
/post/2024/07/18/introducing-global-streetscapes/page-one_hu_6f8bca5586dd92c.webp 1200w"
src="https://ual.sg/post/2024/07/18/introducing-global-streetscapes/page-one_hu_15f889437c5d56b3.webp"
width="566"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_global_streetscapes&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Hou, Yujun and Quintana, Matias and Khomiakov, Maxim and Yap, Winston and Ouyang, Jiani and Ito, Koichi and Wang, Zeyu and Zhao, Tianhong and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.isprsjprs.2024.06.023}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Journal of Photogrammetry and Remote Sensing}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{216-238}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Global Streetscapes -- A comprehensive dataset of 10 million street-level images across 688 cities for urban science and analytics}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{215}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Exciting guest lectures at our group by four professors during July</title><link>https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/</link><pubDate>Wed, 17 Jul 2024 13:39:19 +0800</pubDate><guid>https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/</guid><description>&lt;p&gt;We are having an exciting, vibrant, and busy July!&lt;/p&gt;
&lt;p&gt;This month, we hosted four faculty from overseas for insightful department lectures and group discussion sessions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.itb.ac.id/staf/profil/adiwan-fahlan-aritenang" target="_blank" rel="noopener"&gt;Adiwan Aritenang&lt;/a&gt;, Bandung Institute of Technology 🇮🇩&lt;/li&gt;
&lt;li&gt;&lt;a href="https://sites.rutgers.edu/thakuriah/" target="_blank" rel="noopener"&gt;Vonu Thakuriah&lt;/a&gt;, Rutgers University-New Brunswick 🇺🇸&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.atepoorthuis.com" target="_blank" rel="noopener"&gt;Ate Poorthuis&lt;/a&gt;, KU Leuven 🇧🇪&lt;/li&gt;
&lt;li&gt;&lt;a href="https://dusp.mit.edu/people/fabio-duarte" target="_blank" rel="noopener"&gt;Fábio Duarte&lt;/a&gt;, Massachusetts Institute of Technology 🇺🇸&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The topics of the lectures are listed below, while the abstracts and more information are available on &lt;a href="https://ual.sg/seminars"&gt;our seminars website&lt;/a&gt;.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;User-Generated Data: Alternative Data Source for Urban Planning in Indonesia&lt;/li&gt;
&lt;li&gt;Ethics and Responsible Innovation in Data-Intensive Smart Cities and Urban Mobility Management&lt;/li&gt;
&lt;li&gt;Twenty years of using crowd-sourced geodata to analyze urban phenomena – where do we go next?&lt;/li&gt;
&lt;li&gt;What happens with big data approaches to urban science when data don&amp;rsquo;t exist?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Big thanks to our guests for their time and the wonderful talks!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/1_hu_f24542c44e9e9032.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/1_hu_238957fa936e6e54.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/1_hu_653497b0c990cd0c.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/1_hu_f24542c44e9e9032.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/2_hu_8315bc98546193df.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/2_hu_38c9af7cb0204674.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/2_hu_25d01c3ac581262e.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/2_hu_8315bc98546193df.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/3_hu_e2b3ad73742772cc.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/3_hu_30dcfe3af44ea2c5.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/3_hu_3f3d055932b9a780.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/3_hu_e2b3ad73742772cc.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/4_hu_c0600749028d61b8.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/4_hu_422ceaa9d97c68e6.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/4_hu_bc6f1dd07c9f7e93.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/4_hu_c0600749028d61b8.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/5_hu_2c47e1c48da06fe0.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/5_hu_60a8469f8913bbf0.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/5_hu_42db8134e37381a2.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/5_hu_2c47e1c48da06fe0.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/6_hu_3ffd766c5967e766.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/6_hu_180f4e930d1346d2.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/6_hu_f78be2d32dd17224.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/6_hu_3ffd766c5967e766.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/7_hu_1eb1acb5811ba9a1.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/7_hu_ecc9e8243f2e9cec.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/7_hu_12af8180a1ba3214.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/7_hu_1eb1acb5811ba9a1.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/8_hu_6f77cacc5419e451.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/8_hu_a979e33bd7318c70.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/8_hu_1c62c27f18b08ff1.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/8_hu_6f77cacc5419e451.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/9_hu_f94ba9d68efb21ea.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/9_hu_e2d72680adb45058.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/9_hu_60431612b7c4cfd2.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/9_hu_f94ba9d68efb21ea.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/10_hu_9e210a6b3893e3c1.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/10_hu_f0b3f620552565cb.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/10_hu_a93d9072c0dabea0.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/10_hu_9e210a6b3893e3c1.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/11_hu_1cf3f3ec2550d055.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/11_hu_ed468d0d2ca4210d.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/11_hu_caf73b2a4b3c656d.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/11_hu_1cf3f3ec2550d055.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/12_hu_6a58d17d07abe4b4.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/12_hu_8422a2ed12d61988.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/12_hu_8469223e22a0debd.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/12_hu_6a58d17d07abe4b4.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/13_hu_2aa7ac66731a87cc.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/13_hu_206f83989f6d1893.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/13_hu_4616d2ac19259d77.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/13_hu_2aa7ac66731a87cc.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/14_hu_d5e0f6a0ab1af5c9.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/14_hu_76198b7466744af4.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/14_hu_6c84cdb6b064627.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/14_hu_d5e0f6a0ab1af5c9.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/15_hu_b78e86f273c42b76.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/15_hu_5fb37111774b0e52.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/15_hu_cc90f84750c18914.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/15_hu_b78e86f273c42b76.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/16_hu_c3b07e70e820b117.webp 400w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/16_hu_7442600489115d.webp 760w,
/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/16_hu_c0ec3a43708bee70.webp 1200w"
src="https://ual.sg/post/2024/07/17/exciting-guest-lectures-at-our-group-by-four-professors-during-july/16_hu_c3b07e70e820b117.webp"
width="760"
height="573"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Read more about our seminars on its &lt;a href="https://ual.sg/seminars"&gt;website&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Our participation at the 3D GeoInfo 2024 conference</title><link>https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/</link><pubDate>Sun, 07 Jul 2024 20:48:19 +0800</pubDate><guid>https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/</guid><description>&lt;p&gt;The &lt;a href="https://3dgeoinfoeg-ice.webs.uvigo.es" target="_blank" rel="noopener"&gt;19th International 3D GeoInfo Conference&lt;/a&gt; took place in Vigo, Galicia, Spain on 1-3 July 2024.
It was hosted by &lt;a href="https://cintecx.uvigo.es/es/teacher/lucia-diaz-vilarino/" target="_blank" rel="noopener"&gt;Lucía Díaz Vilariño&lt;/a&gt; and colleagues at the &lt;a href="https://www.uvigo.gal/en" target="_blank" rel="noopener"&gt;University of Vigo&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/0_hu_74ccbb4ba8565221.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/0_hu_d11ae0d9d92c1dc5.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/0_hu_c00aceab3938d05a.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/0_hu_74ccbb4ba8565221.webp"
width="760"
height="238"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;From the &lt;a href="https://3dgeoinfoeg-ice.webs.uvigo.es" target="_blank" rel="noopener"&gt;website&lt;/a&gt;:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The 3D GeoInfo Conference aims to bring together international researchers, practitioners, and professionals from academia, industry, and government to discuss the latest advances in 3D geoinformation science and technology. This annual event offers a multidisciplinary and inter-sectorial forum in the fields of 3D/4D data collection, management, data quality, analysis, advanced modelling, and visualization, with a strong focus on cutting-edge research, standardisation, technical, implementation and application issues across different disciplines.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This year, the conference was collocated with the &lt;a href="https://3dgeoinfoeg-ice.webs.uvigo.es/eg-ice" target="_blank" rel="noopener"&gt;31st EG-ICE International Workshop on Intelligent Computing in Engineering&lt;/a&gt;, forming the Joint 3D GeoInfo conference and EG-ICE workshop 2024.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/1_hu_e29b77a00ec14716.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/1_hu_1760ff33f85d9925.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/1_hu_6a8e7a2e4dccf93e.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/1_hu_e29b77a00ec14716.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;3D GeoInfo is the leading conference in this domain. It was established in 2006 in Kuala Lumpur, Malaysia, under the auspices of Professor Alias Abdul Rahman and his research group.
We organised its &lt;a href="https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/"&gt;2019 edition in Singapore&lt;/a&gt;, together with the Singapore Land Authority.&lt;/p&gt;
&lt;p&gt;The conference featured many interesting talks on the most recent developments in 3D GIS and urban digital twins.&lt;/p&gt;
&lt;p&gt;Our research group was represented by &lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt; and &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;; our colleague &lt;a href="https://cde.nus.edu.sg/arch/staffs/rudi-stouffs-dr/" target="_blank" rel="noopener"&gt;Rudi Stouffs&lt;/a&gt; from the &lt;a href="https://cde.nus.edu.sg/arch/" target="_blank" rel="noopener"&gt;NUS Department of Architecture&lt;/a&gt; also attended.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/2_hu_6486d499ddc81b82.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/2_hu_9966c580d3077454.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/2_hu_9c1fdc2cd7793e26.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/2_hu_6486d499ddc81b82.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Three of our papers were published and presented at the conference, co-authored by our lab members &lt;a href="https://ual.sg/author/xiucheng-liang/"&gt;Xiucheng Liang&lt;/a&gt;, &lt;a href="https://ual.sg/author/marcel-ignatius/"&gt;Marcel Ignatius&lt;/a&gt;, and &lt;a href="https://ual.sg/author/kunihiko-fujiwara/"&gt;Kunihiko Fujiwara&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The conference papers have been published in two volumes in &lt;a href="https://isprs-annals.copernicus.org/articles/X-4-W5-2024/" target="_blank" rel="noopener"&gt;ISPRS Annals&lt;/a&gt; and &lt;a href="https://isprs-archives.copernicus.org/articles/XLVIII-4-W11-2024/" target="_blank" rel="noopener"&gt;ISPRS Archives&lt;/a&gt;, edited by Lucía Díaz-Vilariño and Jesús Balado.
Here is the list of our papers with links to access them:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Lei B, Liang X, Biljecki F (2024): Integrating human perception in 3D city models and urban digital twins. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, X-4/W5-2024: 211-218. &lt;a href="https://doi.org/10.5194/isprs-annals-x-4-w5-2024-211-2024" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-x-4-w5-2024-211-2024&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-3-dgeoinfo-perception-dt/2024-3-dgeoinfo-perception-dt.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Ignatius M, Lim J, Gottkehaskamp B, Fujiwara K, Miller C, Biljecki F (2024): Digital Twin and Wearables Unveiling Pedestrian Comfort Dynamics and Walkability in Cities. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, X-4/W5-2024: 195-202. &lt;a href="https://doi.org/10.5194/isprs-annals-x-4-w5-2024-195-2024" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-x-4-w5-2024-195-2024&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-3-dgeoinfo-thermal-walk/2024-3-dgeoinfo-thermal-walk.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Lim J, Biljecki F, Stouffs R (2024): Integration of Movement Data into 3D GIS. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, X-4/W5-2024: 219-227. &lt;a href="https://doi.org/10.5194/isprs-annals-x-4-w5-2024-219-2024" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-x-4-w5-2024-219-2024&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-3-dgeoinfo-movementdata/2024-3-dgeoinfo-movementdata.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/3_hu_fc65585207e2ef93.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/3_hu_56932d0376c90c89.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/3_hu_6333c7c2e3b913b8.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/3_hu_fc65585207e2ef93.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/4_hu_f7382592904bae0.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/4_hu_350f6382de906557.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/4_hu_cbaa8d763a55bf67.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/4_hu_f7382592904bae0.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/5_hu_9175dbfbd05d89db.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/5_hu_33b146e6d6157217.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/5_hu_14efea674ee1ed59.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/5_hu_9175dbfbd05d89db.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;And here are some images illustrating the research.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/workflow_hu_31cab5e3d7d268ff.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/workflow_hu_22051e0e4593b79.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/workflow_hu_292c0eda18d6f2c0.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/workflow_hu_31cab5e3d7d268ff.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/visualisation_hu_aa50c8355876a350.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/visualisation_hu_674579340c681697.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/visualisation_hu_5cd35b1fbef19d24.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/visualisation_hu_aa50c8355876a350.webp"
width="760"
height="710"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/DT_edit_hu_34f745145f0fa85f.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/DT_edit_hu_409f4adeacd78c72.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/DT_edit_hu_5d91884858eb816c.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/DT_edit_hu_34f745145f0fa85f.webp"
width="760"
height="380"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The best paper award went to &lt;a href="https://3d.bk.tudelft.nl/weixiao/" target="_blank" rel="noopener"&gt;Weixiao Gao&lt;/a&gt;, &lt;a href="https://scholar.google.nl/citations?user=DQyb2G8AAAAJ&amp;amp;hl=nl" target="_blank" rel="noopener"&gt;Ravi Peters&lt;/a&gt;, &lt;a href="http://3d.bk.tudelft.nl/hledoux" target="_blank" rel="noopener"&gt;Hugo Ledoux&lt;/a&gt;, and &lt;a href="http://3d.bk.tudelft.nl/jstoter" target="_blank" rel="noopener"&gt;Jantien Stoter&lt;/a&gt; from &lt;a href="https://3d.bk.tudelft.nl" target="_blank" rel="noopener"&gt;3D Geoinformation at TU Delft&lt;/a&gt; for their article &lt;em&gt;&lt;a href="https://doi.org/10.5194/isprs-annals-X-4-W5-2024-171-2024" target="_blank" rel="noopener"&gt;Filling holes in LoD2 building models&lt;/a&gt;&lt;/em&gt;.
Congratulations! 👏🎉&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/6_hu_95a35e4a2ca084cd.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/6_hu_6a0c68ea98e1dd16.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/6_hu_57da2c8d0a6b9c13.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/6_hu_95a35e4a2ca084cd.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;It was a great conference.
We greatly appreciate the organisation by Lucía and her colleagues at the University of Vigo.&lt;/p&gt;
&lt;p&gt;The next edition of the conference, in 2025, will take place in Japan.
It will be collocated with the Smart Data and Smart Cities (SDSC) conference in the first week of September 2025, at the University of Tokyo &amp;ndash; Kashiwa campus, organised by the &lt;a href="http://sekilab.iis.u-tokyo.ac.jp" target="_blank" rel="noopener"&gt;lab of Professor Yoshihide Sekimoto&lt;/a&gt; at the &lt;a href="https://www.csis.u-tokyo.ac.jp/en/" target="_blank" rel="noopener"&gt;Center for Spatial Information Science&lt;/a&gt;, and the &lt;a href="https://www.mlit.go.jp/plateau/" target="_blank" rel="noopener"&gt;PLATEAU team&lt;/a&gt; at the Ministry of Land, Infrastructure, Transport and Tourism (MLIT).&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/7_hu_2567159ea36cc159.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/7_hu_b5db71fd9484b457.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/7_hu_cb9043390c1230ca.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/7_hu_2567159ea36cc159.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Our Lab remains committed to contributing to this vibrant community, and we very much look forward to the next edition of the conference.&lt;/p&gt;
&lt;p&gt;See you at 3D GeoInfo 2025 at UTokyo Kashiwa! &amp;#x1f1ef;&amp;#x1f1f5;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/8_hu_ad19f1da964f3e3e.webp 400w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/8_hu_7679991095ac479.webp 760w,
/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/8_hu_34e0f77e1bd4b3a2.webp 1200w"
src="https://ual.sg/post/2024/07/07/our-participation-at-the-3d-geoinfo-2024-conference/8_hu_ad19f1da964f3e3e.webp"
width="760"
height="255"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;BibTeX citations of our three papers:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_3dgeoinfo_perception_dt&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Lei, Binyu and Liang, Xiucheng and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-annals-x-4-w5-2024-211-2024}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{211-218}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Integrating human perception in 3D city models and urban digital twins}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{X-4/W5-2024}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_3dgeoinfo_thermal_walk&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Ignatius, Marcel and Lim, Joie and Gottkehaskamp, Ben and Fujiwara, Kunihiko and Miller, Clayton and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-annals-x-4-w5-2024-195-2024}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{195-202}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Digital Twin and Wearables Unveiling Pedestrian Comfort Dynamics and Walkability in Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{X-4/W5-2024}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_3dgeoinfo_movementdata&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Lim, Joie and Biljecki, Filip and Stouffs, Rudi}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-annals-x-4-w5-2024-219-2024}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{219-227}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Integration of Movement Data into 3D GIS}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{X-4/W5-2024}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Decarbonizing megacities: A spatiotemporal analysis considering inter-city travel and the 15-minute city concept</title><link>https://ual.sg/post/2024/07/04/new-paper-decarbonizing-megacities-a-spatiotemporal-analysis-considering-inter-city-travel-and-the-15-minute-city-concept/</link><pubDate>Thu, 04 Jul 2024 22:01:09 +0800</pubDate><guid>https://ual.sg/post/2024/07/04/new-paper-decarbonizing-megacities-a-spatiotemporal-analysis-considering-inter-city-travel-and-the-15-minute-city-concept/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Chen W, Tan Z, Wu Y, Biljecki F, Liao S, Zhou Q, Li H, Zheng Y, Gao F (2024): Decarbonizing megacities: A spatiotemporal analysis considering inter-city travel and the 15-minute city concept. &lt;em&gt;Cities&lt;/em&gt; 152: 105252. &lt;a href="https://doi.org/10.1016/j.cities.2024.105252" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.cities.2024.105252&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-cities-decarbonizing/2024-cities-decarbonizing.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/wangyang-chen/"&gt;Wangyang Chen&lt;/a&gt;.
Congratulations on the publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/c/1jMpvy5jOuwSz" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-08-22.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/04/new-paper-decarbonizing-megacities-a-spatiotemporal-analysis-considering-inter-city-travel-and-the-15-minute-city-concept/1_hu_4205691cbb32b6b2.webp 400w,
/post/2024/07/04/new-paper-decarbonizing-megacities-a-spatiotemporal-analysis-considering-inter-city-travel-and-the-15-minute-city-concept/1_hu_605677e88079ecf0.webp 760w,
/post/2024/07/04/new-paper-decarbonizing-megacities-a-spatiotemporal-analysis-considering-inter-city-travel-and-the-15-minute-city-concept/1_hu_a6fc00a3eb1f5d78.webp 1200w"
src="https://ual.sg/post/2024/07/04/new-paper-decarbonizing-megacities-a-spatiotemporal-analysis-considering-inter-city-travel-and-the-15-minute-city-concept/1_hu_4205691cbb32b6b2.webp"
width="760"
height="723"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Diversified spatiotemporal patterns of road CO2 emissions in Guangzhou were revealed.&lt;/li&gt;
&lt;li&gt;Novel methods were proposed to compare emission disparities between intra-city and inter-city trips.&lt;/li&gt;
&lt;li&gt;Decarbonization potential of the 15-minute city concept was quantified in a megacity context.&lt;/li&gt;
&lt;li&gt;Efficient paradigms for promoting the low-carbon 15-minute city in megacities were proposed.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Megacities are major contributors to global road CO2 emissions, highlighting their pivotal role in achieving low-carbon development. However, comprehensive studies on emission patterns and decarbonization strategies in these metropolitan areas remain limited. This study presents a novel and portable big data-based workflow for megacities to reveal their spatiotemporal dynamics of road CO2 emissions and quantify decarbonization potentials associated with inter-city travel and the 15-minute city concept. We take Guangzhou City (China) as a case study. Our results reveal that primary purpose trips produce 17% more CO2 emissions than secondary trips on average. Inter-city trips account for 36.3% of the total emissions in the city, and those for primary purposes exhibit closer spatial distributions with intra-city trips. While providing more 15-minute-walk POIs exhibits a marginally diminishing effect on reducing trip average emissions, comprehensive implementation of the 15-minute city concept in Guangzhou can reduce up to 56.3% of the total emissions from non-home-related passenger trips, with variations observed across different trip purposes (40%–70%). A significant “head effect” of decarbonization potential across communities exists for all trip purposes. Our study highlights the environmental limitations of monocentric urban planning models in megacities and contributes valuable insights for crafting effective strategies for sustainable urban development.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-cities-decarbonizing/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-cities-decarbonizing/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/04/new-paper-decarbonizing-megacities-a-spatiotemporal-analysis-considering-inter-city-travel-and-the-15-minute-city-concept/page-one_hu_c82d882ffe77002d.webp 400w,
/post/2024/07/04/new-paper-decarbonizing-megacities-a-spatiotemporal-analysis-considering-inter-city-travel-and-the-15-minute-city-concept/page-one_hu_d84fc9c325730357.webp 760w,
/post/2024/07/04/new-paper-decarbonizing-megacities-a-spatiotemporal-analysis-considering-inter-city-travel-and-the-15-minute-city-concept/page-one_hu_2d7c1c08cc83ee30.webp 1200w"
src="https://ual.sg/post/2024/07/04/new-paper-decarbonizing-megacities-a-spatiotemporal-analysis-considering-inter-city-travel-and-the-15-minute-city-concept/page-one_hu_c82d882ffe77002d.webp"
width="590"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_cities_decarbonizing&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Chen, Wangyang and Tan, Ziyi and Wu, Yaxin and Biljecki, Filip and Liao, Shunyi and Zhou, Qingya and Li, Hongbao and Zheng, Yuming and Gao, Feng}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.cities.2024.105252}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{105252}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Decarbonizing megacities: A spatiotemporal analysis considering inter-city travel and the 15-minute city concept}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{152}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Techniques and Tools for Integrating Building Material Stock Analysis and Life Cycle Assessment at the Urban Scale</title><link>https://ual.sg/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/</link><pubDate>Wed, 03 Jul 2024 14:01:09 +0800</pubDate><guid>https://ual.sg/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Pei W, Biljecki F, Stouffs R (2024): Techniques and tools for integrating building material stock analysis and life cycle assessment at the urban scale: A systematic literature review. &lt;em&gt;Building and Environment&lt;/em&gt; 262: 111741. &lt;a href="https://doi.org/10.1016/j.buildenv.2024.111741" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.buildenv.2024.111741&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-bae-stock-review/2024-bae-stock-review.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://www.researchgate.net/profile/Wanyu-Pei-2" target="_blank" rel="noopener"&gt;Wanyu Pei&lt;/a&gt;.
Congratulations on the publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/c/1jMg51HudNFfV9" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-08-22.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/1_hu_686a9d052d848ba6.webp 400w,
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/1_hu_e39fd02d613988cf.webp 760w,
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/1_hu_e0792992e2fde82d.webp 1200w"
src="https://ual.sg/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/1_hu_686a9d052d848ba6.webp"
width="760"
height="345"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;The urban building stock has a high demand for materials and energy, exerting tremendous pressure on natural resources. A current research trend is to integrate Building Material Stock (BMS) analysis with Life Cycle Assessment (LCA) to evaluate energy use, material stock/flows, and related environmental performance associated with the life cycle of building stocks. Compared with urban building energy modelling (UBEM), material-related analysis is a relatively new topic. Some studies applied new techniques and tools to improve the modelling and the dynamic evolution of the BMS system. However, there is a lack of comprehensive review studies summarising the recent publications on these applications. Therefore, this study conducts a comprehensive literature review, primarily focusing on examining the tools and techniques employed for integrating “BMS-LCA” at the urban scale. This review includes 99 articles chosen from a pool of 557 related papers (in the recent decade), systematically retrieved from Scopus and Web of Science, along with additional manual searches. Through a comprehensive bibliometric and content-based synthesis analysis of selected literature, this paper synthesises the techniques/tools used for effectively completing various analysis phases of “BMS-LCA”, including data collection and processing, stock modelling and analysis, and result evaluations. Key findings highlight the significance of integrating artificial intelligence and geospatial technology in optimising data collection, machine learning and data-driven models in enhancing building stock aggregation and classification, and innovative application of relevant LCA software and databases in facilitating BMS’s LCA, etc. This review provides a valuable reference for researchers in future investigations, as it identifies novel ways of applying techniques/tools and opportunities for methodology improvement in the urban-level “BMS-LCA” study.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/2_hu_8e19269a086b35d7.webp 400w,
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/2_hu_391370608f485880.webp 760w,
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/2_hu_5d581e17a88e8e7e.webp 1200w"
src="https://ual.sg/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/2_hu_8e19269a086b35d7.webp"
width="760"
height="362"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/3_hu_908f96d5ea92f50.webp 400w,
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/3_hu_e9918a5dbb7941bf.webp 760w,
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/3_hu_f97a4a09fd20cab2.webp 1200w"
src="https://ual.sg/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/3_hu_908f96d5ea92f50.webp"
width="760"
height="333"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-bae-stock-review/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-bae-stock-review/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/page-one_hu_6c86835d1ce20e20.webp 400w,
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/page-one_hu_302eaec957ea615e.webp 760w,
/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/page-one_hu_d2ca90027f6952.webp 1200w"
src="https://ual.sg/post/2024/07/03/new-paper-techniques-and-tools-for-integrating-building-material-stock-analysis-and-life-cycle-assessment-at-the-urban-scale/page-one_hu_6c86835d1ce20e20.webp"
width="582"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_bae_stock_review&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Pei, Wanyu and Biljecki, Filip and Stouffs, Rudi}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.buildenv.2024.111741}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Building and Environment}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{111741}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Techniques and tools for integrating building material stock analysis and life cycle assessment at the urban scale: A systematic literature review}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{262}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Guest lectures at Peking University and Tsinghua University</title><link>https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/</link><pubDate>Sat, 22 Jun 2024 18:01:29 +0800</pubDate><guid>https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/</guid><description>&lt;p&gt;The PI of the Urban Analytics Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; and PhD researcher &lt;a href="https://ual.sg/author/zicheng-fan/"&gt;Zicheng Fan&lt;/a&gt;, have visited and presented research at Peking University and Tsinghua University in Beijing, China 🇨🇳.&lt;/p&gt;
&lt;p&gt;The guest lectures were hosted by &lt;a href="https://irsgis.pku.edu.cn/english/facultystaff/gis/zhangfan/index.htm" target="_blank" rel="noopener"&gt;Fan Zhang&lt;/a&gt; at the &lt;a href="https://irsgis.pku.edu.cn/english/index.htm" target="_blank" rel="noopener"&gt;PKU Institute of Remote Sensing and GIS&lt;/a&gt; and &lt;a href="http://www.arch.tsinghua.edu.cn/info/FUrban%20Planning%20and%20Design/1760" target="_blank" rel="noopener"&gt;Ying Long&lt;/a&gt; at &lt;a href="http://www.arch.tsinghua.edu.cn/column/Home" target="_blank" rel="noopener"&gt;Tsinghua&amp;rsquo;s School of Architecture&lt;/a&gt; and &lt;a href="https://www.beijingcitylab.com/" target="_blank" rel="noopener"&gt;Beijing City Lab&lt;/a&gt;, renowned institutions in their respective fields.&lt;/p&gt;
&lt;p&gt;It was a pleasure to be in the company of exceptional scholars and learn more about their work.&lt;/p&gt;
&lt;p&gt;Many thanks to collaborators and hosts, and everyone else for the organisation and great hospitality.
We look forward to continuing to collaborate with these wonderful research groups.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/1_hu_88d44ee5df3eab7e.webp 400w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/1_hu_58dcf714d931f8c1.webp 760w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/1_hu_46b2464fdf3a9a0f.webp 1200w"
src="https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/1_hu_88d44ee5df3eab7e.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/2_hu_43f546c659d2c0ab.webp 400w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/2_hu_3ba29adbe683fde4.webp 760w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/2_hu_3ce62b89afc1487.webp 1200w"
src="https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/2_hu_43f546c659d2c0ab.webp"
width="760"
height="584"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/3_hu_f3fb54128cd310c0.webp 400w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/3_hu_4568d8b5038bd879.webp 760w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/3_hu_19ab4b3861ad8e4d.webp 1200w"
src="https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/3_hu_f3fb54128cd310c0.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/4_hu_bc5e002baf2a8dc5.webp 400w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/4_hu_db005ddb7d0080ea.webp 760w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/4_hu_3393534ebf85c459.webp 1200w"
src="https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/4_hu_bc5e002baf2a8dc5.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/5_hu_74d88771cf55fd64.webp 400w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/5_hu_32a5a18c6ab8531a.webp 760w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/5_hu_b62291010f575d83.webp 1200w"
src="https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/5_hu_74d88771cf55fd64.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/6_hu_e6ff4b4b4c6c1b85.webp 400w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/6_hu_f12721fe92c84c2c.webp 760w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/6_hu_92509eef5bf65f85.webp 1200w"
src="https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/6_hu_e6ff4b4b4c6c1b85.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/7_hu_93ffca64093cb7aa.webp 400w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/7_hu_125f6eb8ed21e1d8.webp 760w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/7_hu_3bcb484ade988bb4.webp 1200w"
src="https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/7_hu_93ffca64093cb7aa.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/8_hu_dcbbabc28bba9cf5.webp 400w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/8_hu_8838724c75737c25.webp 760w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/8_hu_b4e613262d2dfc31.webp 1200w"
src="https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/8_hu_dcbbabc28bba9cf5.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/9_hu_930d1e54e75dce44.webp 400w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/9_hu_8ca20cc7c10d1800.webp 760w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/9_hu_ec575b59678153af.webp 1200w"
src="https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/9_hu_930d1e54e75dce44.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/10_hu_118a2cfd64db81c6.webp 400w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/10_hu_f8485504f4e7d4cb.webp 760w,
/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/10_hu_5d079acc12aef997.webp 1200w"
src="https://ual.sg/post/2024/06/22/guest-lectures-at-peking-university-and-tsinghua-university/10_hu_118a2cfd64db81c6.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper: Understanding Urban Perception with Visual Data</title><link>https://ual.sg/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/</link><pubDate>Sat, 22 Jun 2024 12:05:22 +0800</pubDate><guid>https://ual.sg/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Ito K, Kang Y, Zhang Y, Zhang F, Biljecki F (2024): Understanding Urban Perception with Visual Data: A Systematic Review. Cities, 152: 105169. &lt;a href="https://doi.org/10.1016/j.cities.2024.105169" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.cities.2024.105169&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-cities-perception-rev/2024-cities-perception-rev.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/koichi-ito/"&gt;Koichi Ito&lt;/a&gt;.
Congratulations on this important journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/1_hu_abd9e8c9665e70a.webp 400w,
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/1_hu_8d55dbea79353488.webp 760w,
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/1_hu_c60a0935280ace64.webp 1200w"
src="https://ual.sg/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/1_hu_abd9e8c9665e70a.webp"
width="745"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1jIcAy5jOuwFw" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-08-10.&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Comprehensive and overarching review of 393 urban visual perception studies&lt;/li&gt;
&lt;li&gt;Novel approach to semi-automate the systematic review with NLP and LLMs&lt;/li&gt;
&lt;li&gt;Identified six dominant categories (e.g.&amp;nbsp;greenery and water, street design)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/2_hu_33d982a9fdc1e3d2.webp 400w,
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/2_hu_537d3280c9e0cbe6.webp 760w,
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/2_hu_7e2f3e6304ab0bfa.webp 1200w"
src="https://ual.sg/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/2_hu_33d982a9fdc1e3d2.webp"
width="760"
height="640"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Visual characteristics of the built environment affect how people perceive and experience cities. For a long time, many studies have examined visual perception in cities. Such efforts have accelerated in recent years due to advancements in technologies and the proliferation of relevant data (e.g., street view imagery, geo-tagged photos, videos, virtual reality, and aerial imagery). There has not been a comprehensive systematic review paper on this topic to reveal an overarching set of research trends, limitations, and future research opportunities. Such omission is plausibly due to the difficulty in reviewing a large number of relevant papers on this popular topic. In this study, we utilized machine learning techniques (i.e., natural language processing and large language models) to semi-automate the review process and reviewed 393 relevant papers. Through the review, we found that these papers can be categorized into the physical aspects of cities: greenery and water, street design, building design, landscape, public space, and the city as a whole. We also revealed that many studies conducted quantitative analyses with a recent trend of increasingly utilizing big data and advanced technologies, such as combinations of street view imagery and deep learning models. Limitations and research gaps were also identified as follows: (1) a limited scope in terms of study areas, sample size, and attributes; (2) low quality of subjective and visual data; and (3) the need for more controlled and sophisticated methods to infer more closely examined impacts of visual features on human perceptions. We suggest that future studies utilize and contribute to open data and take advantage of existing data and technologies to examine the causality of visual features on human perception. The approach developed to accelerate this review proved to be accurate, efficient, and insightful. Considering its novelty, we also describe it to enable replications in the future.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/3_hu_b96d98a927554591.webp 400w,
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/3_hu_92a86bb22e751a06.webp 760w,
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/3_hu_40311a7de1e23c56.webp 1200w"
src="https://ual.sg/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/3_hu_b96d98a927554591.webp"
width="374"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/4_hu_bced68d6a7063afd.webp 400w,
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/4_hu_f7abb18c76478cdd.webp 760w,
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/4_hu_458f9d61e29df67f.webp 1200w"
src="https://ual.sg/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/4_hu_bced68d6a7063afd.webp"
width="492"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-cities-perception-rev/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-cities-perception-rev/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/page-one_hu_64d4edfdbb48ca66.webp 400w,
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/page-one_hu_40abbcb356212f7.webp 760w,
/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/page-one_hu_9a5e00bc66e57b22.webp 1200w"
src="https://ual.sg/post/2024/06/22/new-paper-understanding-urban-perception-with-visual-data/page-one_hu_64d4edfdbb48ca66.webp"
width="565"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_cities_perception_rev&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Ito, Koichi and Kang, Yuhao and Zhang, Ye and Zhang, Fan and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.cities.2024.105169}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{105169}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Understanding urban perception with visual data: A systematic review}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{152}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Our university remains ranked as one of the best in the world 🌏</title><link>https://ual.sg/post/2024/06/21/our-university-remains-ranked-as-one-of-the-best-in-the-world/</link><pubDate>Fri, 21 Jun 2024 12:03:29 +0800</pubDate><guid>https://ual.sg/post/2024/06/21/our-university-remains-ranked-as-one-of-the-best-in-the-world/</guid><description>&lt;p&gt;The National University of Singapore has retained its position as eighth in the world and best in all of Asia, according to the results of the latest Quacquarelli Symonds (QS) World University Rankings (WUR) 2025.&lt;/p&gt;
&lt;p&gt;The ranking is the same as last year, continuing the trend: NUS is the first and only Asian university to have reached the top 10 globally, joining the likes of MIT, Stanford, Harvard, Cambridge, Oxford, ETH Zurich, Berkeley&amp;hellip;&lt;/p&gt;
&lt;p&gt;We are humbled to be part of this incredible journey of our National University of Singapore &amp;amp; NUS Department of Architecture, reflecting our commitment to excellence in research and education. 🏅🚀&lt;/p&gt;
&lt;p&gt;We include below the &lt;a href="https://news.nus.edu.sg/nus-at-world-no-8-and-top-in-asia-in-qs-world-university-rankings-2025/" target="_blank" rel="noopener"&gt;press release&lt;/a&gt; by the University.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="nus-at-world-no-8-and-top-in-asia-in-qs-world-university-rankings-2025"&gt;NUS at world No. 8 and top in Asia in QS World University Rankings 2025&lt;/h3&gt;
&lt;p&gt;NUS has reaffirmed its strong reputation as a global leader in higher education, placing eighth in the world and first in Asia, based on the latest Quacquarelli Symonds (QS) World University Rankings (WUR) 2025 released on 4 June 2024.&lt;/p&gt;
&lt;p&gt;This year, the annual comparative ranking features 1,500 universities across 106 locations, including four universities from Singapore. The latest ranking results position NUS among the top 0.5% of universities worldwide and the only Asian university among the top 10.&lt;/p&gt;
&lt;p&gt;“Higher education in Singapore has transformed significantly as universities expand the space for interdisciplinary teaching and research with greater pace and scale, so that it becomes second nature for graduates to be critical thinkers and flexible learners, while faculty and researchers join hands and bridge discipline silos to solve the ill-defined world problems of today. As a leading global university, NUS continues to push the boundaries in education, research, innovation and enterprise, challenging students to learn and unlearn, grow in adaptability and resilience, and discover a world beyond themselves,” said NUS President Professor Tan Eng Chye.&lt;/p&gt;
&lt;p&gt;Prof Tan attributed NUS’ strong performance to the collective achievement of the University’s outstanding faculty, staff, students, and alumni.&lt;/p&gt;
&lt;p&gt;“This year, most notably, NUS has made considerable progress in the Sustainability indicator, a testament to our whole-of-university approach to shape a sustainable future through interdisciplinary solutions across education, research, and campus operations. The NUS community remains resolute in our commitment to foster a vibrant and dynamic academic environment which inspires and drives positive impact for everyone,” he added.&lt;/p&gt;
&lt;h4 id="consistently-strong-performance-across-key-indicators"&gt;Consistently strong performance across key indicators&lt;/h4&gt;
&lt;p&gt;Among the nine indicators used to derive the world rankings, NUS ranks among the world’s top 50 in five indicators. The University is also one of Asia’s top-performing universities in three indicators.&lt;/p&gt;
&lt;p&gt;In Academic Reputation, NUS ranks 15th globally, making it the third highest in Asia behind The University of Tokyo and Peking University. In Employment Outcomes, NUS excels by placing 6th globally, second only to Seoul National University. For Sustainability, NUS ranks 26th globally, after The University of Tokyo, which is ranked 22nd. In Singapore, NUS emerged the top performer in eight out of nine indicators.&lt;/p&gt;
&lt;p&gt;The full QS World University Rankings 2025 results are available at: &lt;a href="https://www.topuniversities.com/world-university-rankings" target="_blank" rel="noopener"&gt;https://www.topuniversities.com/world-university-rankings&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>UAL at the World Cities Summit 2024</title><link>https://ual.sg/post/2024/06/20/ual-at-the-world-cities-summit-2024/</link><pubDate>Thu, 20 Jun 2024 19:39:19 +0800</pubDate><guid>https://ual.sg/post/2024/06/20/ual-at-the-world-cities-summit-2024/</guid><description>&lt;p&gt;A number of our members, alumni, and associated researchers &amp;ndash; &lt;a href="https://ual.sg/author/yixin-wu/"&gt;Yixin Wu&lt;/a&gt;, &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt;, &lt;a href="https://ual.sg/author/junjie-luo/"&gt;Junjie Luo&lt;/a&gt;, &lt;a href="https://ual.sg/author/chenyi-cai/"&gt;Chenyi Cai&lt;/a&gt;, &lt;a href="https://ual.sg/author/pengyuan-liu/"&gt;Pengyuan Liu&lt;/a&gt; and &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; &amp;ndash; have participated in this year&amp;rsquo;s &lt;a href="https://www.worldcitiessummit.com.sg" target="_blank" rel="noopener"&gt;World Cities Summit&lt;/a&gt; in Singapore.&lt;/p&gt;
&lt;p&gt;The &lt;a href="https://www.worldcitiessummit.com.sg" target="_blank" rel="noopener"&gt;World Cities Summit (WCS)&lt;/a&gt; is a biennial event for government leaders and industry experts to address liveable and sustainable city challenges, share integrated urban solutions and forge new partnerships.
It is jointly organised by Singapore&amp;rsquo;s &lt;a href="https://www.clc.gov.sg" target="_blank" rel="noopener"&gt;Centre for Liveable Cities (CLC)&lt;/a&gt; and the &lt;a href="https://www.ura.gov.sg" target="_blank" rel="noopener"&gt;Urban Redevelopment Authority (URA)&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We presented our work and took part in a range of activities.&lt;/p&gt;
&lt;p&gt;At the &lt;a href="https://www.worldcitiessummit.com.sg/programme/wcs-young-leaders" target="_blank" rel="noopener"&gt;Young Leaders Symposium&lt;/a&gt;, our PhD candidate &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt; connected with fellow emerging urban innovators.
This event offered a distinct platform for networking and engaging with a diverse panel of experts from government, academia, and private sectors, fostering valuable interdisciplinary interactions.
The Young Leaders network and its alumni now stand at 550+ members across continents, drawn from both the public and private sectors, including Mikko Kiesiläinen, Chief Economist for the City of Helsinki, Finland; Sean Tan, Co-Founder of Insect Feed Technologies; Estibaliz Luengo Celaya, Director of the International Department, Bilbao City Council; and Brice Richard, Strategic Advisory and Smart Cities Lead at Arup.&lt;/p&gt;
&lt;p&gt;Our principal investigator &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; presented one of our latest projects at the &lt;a href="https://www.worldcitiessummit.com.sg/programme/wcs-science-of-cities-symposium" target="_blank" rel="noopener"&gt;Science of Cities Symposium&lt;/a&gt; and served as a panellist on the panel on the Science of People-Centric Cities.
The Symposium drew more than 200 attendees from government agencies, research, and industry, both local and international.
The programme, together with abstracts and posters, can be found &lt;a href="https://www.worldcitiessummit.com.sg/programme/wcs-science-of-cities-symposium" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Below are some photos, courtesy of CLC/URA and our researchers and lab friends.
Many thanks to everyone at CLC and URA for making this event a success and for having us &amp;mdash; it was perfectly organised. 👏&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/20/ual-at-the-world-cities-summit-2024/1_hu_7e37f6e971ac9b22.webp 400w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/1_hu_190fa11fbd5ca5df.webp 760w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/1_hu_668bc8b9307af96a.webp 1200w"
src="https://ual.sg/post/2024/06/20/ual-at-the-world-cities-summit-2024/1_hu_7e37f6e971ac9b22.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/20/ual-at-the-world-cities-summit-2024/2_hu_9a2df916811be2f9.webp 400w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/2_hu_4d7182542820d890.webp 760w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/2_hu_93e4bf29c1ce00ec.webp 1200w"
src="https://ual.sg/post/2024/06/20/ual-at-the-world-cities-summit-2024/2_hu_9a2df916811be2f9.webp"
width="760"
height="506"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/20/ual-at-the-world-cities-summit-2024/3_hu_a792727bc1b6599a.webp 400w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/3_hu_65a20d15ccd73388.webp 760w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/3_hu_741f0b9440015d6.webp 1200w"
src="https://ual.sg/post/2024/06/20/ual-at-the-world-cities-summit-2024/3_hu_a792727bc1b6599a.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/20/ual-at-the-world-cities-summit-2024/4_hu_82f330e577e13ccc.webp 400w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/4_hu_edb046ced7526c6.webp 760w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/4_hu_b4573018b2eb6c7f.webp 1200w"
src="https://ual.sg/post/2024/06/20/ual-at-the-world-cities-summit-2024/4_hu_82f330e577e13ccc.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/20/ual-at-the-world-cities-summit-2024/5_hu_edd3509efb5d6e1e.webp 400w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/5_hu_5b89f138d2fae976.webp 760w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/5_hu_3837a690a412b08c.webp 1200w"
src="https://ual.sg/post/2024/06/20/ual-at-the-world-cities-summit-2024/5_hu_edd3509efb5d6e1e.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/20/ual-at-the-world-cities-summit-2024/6_hu_b9b6012d4fa30553.webp 400w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/6_hu_23d72d10bcb88ae1.webp 760w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/6_hu_11bad0ceec5654df.webp 1200w"
src="https://ual.sg/post/2024/06/20/ual-at-the-world-cities-summit-2024/6_hu_b9b6012d4fa30553.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/20/ual-at-the-world-cities-summit-2024/7_hu_102dba40a2c2552a.webp 400w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/7_hu_31546297c6da2e5a.webp 760w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/7_hu_a4c158a39c2f76e6.webp 1200w"
src="https://ual.sg/post/2024/06/20/ual-at-the-world-cities-summit-2024/7_hu_102dba40a2c2552a.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/20/ual-at-the-world-cities-summit-2024/8_hu_8438188df5ef13c2.webp 400w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/8_hu_d1e9d68c3637a053.webp 760w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/8_hu_b525e894347a9b0d.webp 1200w"
src="https://ual.sg/post/2024/06/20/ual-at-the-world-cities-summit-2024/8_hu_8438188df5ef13c2.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/20/ual-at-the-world-cities-summit-2024/9_hu_511aa16e8324e6a4.webp 400w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/9_hu_107ebb2d3ef75e7d.webp 760w,
/post/2024/06/20/ual-at-the-world-cities-summit-2024/9_hu_344c454f405b1754.webp 1200w"
src="https://ual.sg/post/2024/06/20/ual-at-the-world-cities-summit-2024/9_hu_511aa16e8324e6a4.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>We are organising a Special Issue in EPB on Urban AI</title><link>https://ual.sg/post/2024/06/16/we-are-organising-a-special-issue-in-epb-on-urban-ai/</link><pubDate>Sun, 16 Jun 2024 21:39:19 +0800</pubDate><guid>https://ual.sg/post/2024/06/16/we-are-organising-a-special-issue-in-epb-on-urban-ai/</guid><description>&lt;p&gt;Together with collaborators from Germany and USA, we are organising a special issue in &lt;a href="https://journals.sagepub.com/home/EPB" target="_blank" rel="noopener"&gt;Environment and Planning B: Urban Analytics and City Science&lt;/a&gt;, on the topic of &lt;em&gt;Urban AI for a Sustainable Built Environment&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;Here is the summary of the call for papers:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Recently, Urban AI has become an emerging field that combines AI, spatial computing, and urban science to address complex challenges. This special issue aims to promote the development of urban AI with multimodal geospatial data collected from satellites, street view imagery, and IoT devices to enable evidence-based decision-making for built-environment management and urban infrastructure modelling. Novel research in this direction will address pressing and essential challenges in urban environments, ranging from sustainable urban planning and smart mobility design to built-environment management, public health, urban land use, urban disaster management, and AI-assisted humanitarian mapping to help fight extreme heat and mitigate the impacts of climate change.
Keywords: Urban AI, Built Environment, Geospatial Big Data, Informed Decision-Making, Volunteered Geographic Information&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The special issue is edited by Steffen Knoblauch (Heidelberg University, Germany), Hao Li (Technical University of Munich, Germany), Filip Biljecki (National University of Singapore, Singapore), Wenwen Li (Arizona State University, USA), and Alexander Zipf (Heidelberg University, Germany).&lt;/p&gt;
&lt;p&gt;For the full CfP see &lt;a href="https://journals.sagepub.com/pb-assets/cmscontent/epb/Special%20Issue%20for%20Environment%20and%20Planning%20B%20-%20GeoAI%20%281%29-1718340150.pdf" target="_blank" rel="noopener"&gt;here&lt;/a&gt;, while for all past and ongoing SIs in EPB check &lt;a href="https://journals.sagepub.com/page/epb/collections/special-issues" target="_blank" rel="noopener"&gt;this page&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The submission deadline is in December 2024.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://journals.sagepub.com/overview-metric/EPB" target="_blank" rel="noopener"&gt;Environment and Planning B: Urban Analytics and City Science&lt;/a&gt; is the leading journal for the publication of high-quality articles that present cutting-edge research in analytical methods for urban planning and design. The journal focuses on smart cities, urban analytics, GIS, and urban simulation models. It also deals with visualisation, computation, and formal design-based methods applicable to morphological processes and structures in cities and regions.&lt;/p&gt;
&lt;p&gt;Why submit to this special issue?
EPB is a highly regarded journal in the urban analytics community and has published many impactful papers.
It has been one of our journals of choice, and we have published a couple of papers there recently, e.g. &lt;a href="https://ual.sg/publication/2024-epb-xai/"&gt;here&lt;/a&gt; and &lt;a href="https://ual.sg/publication/2023-epb-semantic-networks/"&gt;here&lt;/a&gt;.
Further, this special issue has a unique scope that may suit a variety of submissions.
Finally, EPB publishes papers that are somewhat shorter than those in most other journals, and besides traditional article types it also accepts Urban Data/Code papers, which are a great opportunity to showcase your open-source software or open data products.&lt;/p&gt;
&lt;p&gt;We look forward to receiving your great work and learning what you have been up to!
For further information about the submissions and guide for authors, please check the &lt;a href="https://journals.sagepub.com/home/EPB" target="_blank" rel="noopener"&gt;journal&amp;rsquo;s website&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>New paper: The State of the Art in Visual Analytics for 3D Urban Data</title><link>https://ual.sg/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/</link><pubDate>Sat, 15 Jun 2024 09:44:34 +0800</pubDate><guid>https://ual.sg/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Miranda F, Ortner T, Moreira G, Hosseini M, Vuckovic M, Biljecki F, Silva CT, Lage M, Ferreira N (2024): The State of the Art in Visual Analytics for 3D Urban Data. &lt;em&gt;Computer Graphics Forum&lt;/em&gt; 43(3): e15112. &lt;a href="https://doi.org/10.1111/cgf.15112" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1111/cgf.15112&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-cgf-star-3-dviz/2024-cgf-star-3-dviz.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://fmiranda.me" target="_blank" rel="noopener"&gt;Fabio Miranda&lt;/a&gt; from the Department of Computer Science at the University of Illinois Chicago.
It was conducted in collaboration with academics from other institutions in the USA, Austria, and Brazil: MIT, VRVis, NYU, UFF, and UFPE.&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://doi.org/10.1111/cgf.15112" target="_blank" rel="noopener"&gt;available open access&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-example-of-papers-from-each-one-of-the-types-considered-in-the-survey-for-the-sources-please-check-the-paper"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Example of papers from each one of the types considered in the survey. For the sources, please check the paper." srcset="
/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/1_hu_be3dce858419ddaf.webp 400w,
/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/1_hu_c9e99b9064d3dfd1.webp 760w,
/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/1_hu_cb27de774a71a852.webp 1200w"
src="https://ual.sg/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/1_hu_be3dce858419ddaf.webp"
width="760"
height="116"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Example of papers from each one of the types considered in the survey. For the sources, please check the paper.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Urbanization has amplified the importance of three-dimensional structures in urban environments for a wide range of phenomena that are of significant interest to diverse stakeholders. With the growing availability of 3D urban data, numerous studies have focused on developing visual analysis techniques tailored to the unique characteristics of urban environments. However, incorporating the third dimension into visual analytics introduces additional challenges in designing effective visual tools to tackle urban data&amp;rsquo;s diverse complexities. In this paper, we present a survey on visual analytics of 3D urban data. Our work characterizes published works along three main dimensions (why, what, and how), considering use cases, analysis tasks, data, visualizations, and interactions. We provide a fine-grained categorization of published works from visualization journals and conferences, as well as from a myriad of urban domains, including urban planning, architecture, and engineering. By incorporating perspectives from both urban and visualization experts, we identify literature gaps, motivate visualization researchers to understand challenges and opportunities, and indicate future research directions.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure id="figure-distribution-of-surveyed-papers-according-to-why-and-how-dimensions-with-shades-denoting-tag-occurrence"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Distribution of surveyed papers according to Why and How dimensions, with shades denoting tag occurrence." srcset="
/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/2_hu_581c21a5d6e317d9.webp 400w,
/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/2_hu_5c700b731d347104.webp 760w,
/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/2_hu_fff42d164a6b15c5.webp 1200w"
src="https://ual.sg/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/2_hu_581c21a5d6e317d9.webp"
width="760"
height="439"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Distribution of surveyed papers according to Why and How dimensions, with shades denoting tag occurrence.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-cgf-star-3-dviz/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-cgf-star-3-dviz/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/page-one_hu_373e9d9f116f9226.webp 400w,
/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/page-one_hu_58ae24d6c8692280.webp 760w,
/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/page-one_hu_dc2a824cdfaf7203.webp 1200w"
src="https://ual.sg/post/2024/06/15/new-paper-the-state-of-the-art-in-visual-analytics-for-3d-urban-data/page-one_hu_373e9d9f116f9226.webp"
width="545"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_cgf_star_3dviz&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Miranda, Fabio and Ortner, Thomas and Moreira, Gustavo and Hosseini, Maryam and Vuckovic, Milena and Biljecki, Filip and Silva, Claudio T. and Lage, Marcos and Ferreira, Nivan}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1111/cgf.15112}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computer Graphics Forum}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{3}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{The State of the Art in Visual Analytics for 3D Urban Data}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{43}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Predicting building characteristics at urban scale using graph neural networks and street-level context</title><link>https://ual.sg/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/</link><pubDate>Mon, 20 May 2024 16:05:22 +0800</pubDate><guid>https://ual.sg/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Lei B, Liu P, Milojevic-Dupont N, Biljecki F (2024): Predicting building characteristics at urban scale using graph neural networks and street-level context. Computers, Environment and Urban Systems, 111: 102129. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2024.102129" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2024.102129&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-ceus-gnn-building/2024-ceus-gnn-building.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt;.
Congratulations on this important journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/1_hu_bd4ad1a143c55994.webp 400w,
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/1_hu_c24b2e7015d75c02.webp 760w,
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/1_hu_1ea357de305b4a3a.webp 1200w"
src="https://ual.sg/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/1_hu_bd4ad1a143c55994.webp"
width="640"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1j6Z3jFQh0YjD" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-07-07.&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;A graph-based spatially explicit GeoAI framework to complete building semantics.&lt;/li&gt;
&lt;li&gt;Novel graph representation of urban objects for predicting building characteristics.&lt;/li&gt;
&lt;li&gt;Cross-validation of the usability of the framework across different cities.&lt;/li&gt;
&lt;li&gt;Robust predictive capabilities despite limited data availability.&lt;/li&gt;
&lt;li&gt;Potential use case for estimating building volume and generating a 3D city model.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/2_hu_37bb28c9d53704a1.webp 400w,
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/2_hu_3f573e7b8564ba73.webp 760w,
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/2_hu_28edc24177ee69a6.webp 1200w"
src="https://ual.sg/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/2_hu_37bb28c9d53704a1.webp"
width="760"
height="454"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Building characteristics, such as number of storeys and type, play a key role across many domains: interpreting urban form, simulating urban microclimate or modelling building energy. However, geospatial data on the building stock is often fragmented and incomplete. Here, we propose a novel and easily adaptable method to predict building characteristics in diverse cities, which attempts to mitigate such data gaps. Our method exploits the geospatial connectivity between street-level urban objects and building characteristics by employing graph neural networks, as they can model spatial relationships and leverage them for predictions. We apply this approach in three representative cities (Boston, Melbourne, and Helsinki) that offer a variety of building features as prediction targets (storeys, types, construction period and materials) and diverse urban environments as predictors. Overall, the magnitude of errors is acceptable for a series of use cases. In the prediction of building storeys, an average of 81.83% buildings in three cities have less than one-storey prediction error. We also find that the prediction of building type achieves an average of 88.33% accuracy across three cities. Meanwhile, an average of 70.5% of buildings are correctly classified by construction period in Melbourne and Helsinki, and the building material prediction accuracy is 68% in Helsinki. The results confirm that our approach is adaptable across different urban environments because comparable performance is achieved in the other two cities. Further, we assess the impact of varying local data availability on model performance. Our findings underscore the feasibility of the method in scenarios with sparse building data (10%, 30% and 50% availability). Our graph-based approach advances research on filling in incomplete building semantics from existing datasets, and showcases the potential to enable 3D city modelling. 
Given the broad applicability of the approach to predicting many building characteristics, diverse downstream applications exist, such as enhancing contemporary urban studies (e.g. exploring streetscapes) and facilitating the development of 3D GIS (e.g. maintaining and updating 3D building settings).&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/3_hu_26be9a23282ba929.webp 400w,
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/3_hu_184d0aa2b59fb9e2.webp 760w,
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/3_hu_b59701a6c5daca2a.webp 1200w"
src="https://ual.sg/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/3_hu_26be9a23282ba929.webp"
width="676"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-ceus-gnn-building/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-ceus-gnn-building/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/page-one_hu_d104fea3caf4d78.webp 400w,
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/page-one_hu_1666b67d0cc3b0e4.webp 760w,
/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/page-one_hu_deb3ae48094bcb80.webp 1200w"
src="https://ual.sg/post/2024/05/20/new-paper-predicting-building-characteristics-at-urban-scale-using-graph-neural-networks-and-street-level-context/page-one_hu_d104fea3caf4d78.webp"
width="586"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_ceus_gnn_building&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Lei, Binyu and Liu, Pengyuan and Milojevic-Dupont, Nikola and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2024.102129}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{102129}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Predicting building characteristics at urban scale using graph neural networks and street-level context}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{111}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Visit by Prof Yuhao Kang from the University of South Carolina &amp; University of Texas at Austin</title><link>https://ual.sg/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/</link><pubDate>Mon, 20 May 2024 01:39:19 +0800</pubDate><guid>https://ual.sg/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/</guid><description>&lt;p&gt;Our Lab hosted Dr &lt;a href="http://www.kkyyhh96.site/" target="_blank" rel="noopener"&gt;Yuhao Kang&lt;/a&gt;, Assistant Professor at the University of South Carolina and University of Texas at Austin. 🇺🇸&lt;/p&gt;
&lt;p&gt;Dr. Yuhao Kang leads the GISense Lab. He was a postdoctoral researcher at MIT, received his Ph.D. from the University of Wisconsin-Madison, and has industry experience at Google X and MoBike. Dr. Kang&amp;rsquo;s research focuses on Human-centered Geospatial Data Science, including understanding human subjective experience of place and developing ethical and responsible geospatial artificial intelligence (GeoAI) approaches. By leveraging human-centered geospatial data science, his work has benefited applications in public health, real estate, crime, and urban planning. His papers have been published in Landscape and Urban Planning, IJGIS, Cities, PNAS, and other venues. He has received several fellowships and awards, including the Young Researcher Award of the Austrian Academy of Sciences, the CaGIS Rising Award, a CaGIS scholarship, and an ICA scholarship. He contributes actively to the GIScience community: he founded GISphere, a non-profit global education project with over 20,000 members, serves as an associate editor of Computational Urban Science, and serves as a board member of the AAG GISS/CyberGIS/Cartography groups and CPGIS.&lt;/p&gt;
&lt;p&gt;During his stay, besides several collaborative exchanges such as discussion sessions and meetings, Yuhao delivered the lecture &lt;em&gt;Advancing Sense of Place with Human-centered Geospatial Data Science&lt;/em&gt; (poster and abstract below).&lt;/p&gt;
&lt;p&gt;Thanks, and looking forward to future collaborations!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/1_hu_899a341e76fcfc2f.webp 400w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/1_hu_61d5100f1ce13965.webp 760w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/1_hu_2da255cc86086fc7.webp 1200w"
src="https://ual.sg/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/1_hu_899a341e76fcfc2f.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/2_hu_34ce47bc96657896.webp 400w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/2_hu_7be1c34c97e6d615.webp 760w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/2_hu_5ac6519e26663e45.webp 1200w"
src="https://ual.sg/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/2_hu_34ce47bc96657896.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/3_hu_466a5c914785bf1b.webp 400w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/3_hu_be1c11ec23c558e7.webp 760w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/3_hu_aa72c4d8d264dd43.webp 1200w"
src="https://ual.sg/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/3_hu_466a5c914785bf1b.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/4_hu_66348d3efc57b7e3.webp 400w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/4_hu_e76ade40d2ccbf77.webp 760w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/4_hu_ee9ad9454010d74d.webp 1200w"
src="https://ual.sg/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/4_hu_66348d3efc57b7e3.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/5_hu_9d13a954ea25b8eb.webp 400w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/5_hu_83ba6a8865aa5845.webp 760w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/5_hu_3a057007684ecf55.webp 1200w"
src="https://ual.sg/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/5_hu_9d13a954ea25b8eb.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/poster_hu_60d28ed7e05f5044.webp 400w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/poster_hu_e757092a617fe699.webp 760w,
/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/poster_hu_69625a513c638e0d.webp 1200w"
src="https://ual.sg/post/2024/05/20/visit-by-prof-yuhao-kang-from-the-university-of-south-carolina-university-of-texas-at-austin/poster_hu_60d28ed7e05f5044.webp"
width="538"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract-of-the-lecture"&gt;Abstract of the lecture&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Human sense of place refers to how we perceive, experience, and interact with a particular location and environment. The emergence of Geospatial Data Science – the use of geographic knowledge and AI approaches to extract meaningful insights from large-scale geographic data – has achieved remarkable success not only in modeling physical geographic phenomena but also in advancing human subjective experiences at place. In this talk, Dr. Kang will present a series of works that utilize geospatial data science to understand human experience and sense of place. First, utilizing eye-tracking systems, his work delved into human subjective safety perceptions (e.g., whether a neighborhood is perceived as a safe place) to identify physical objects that attract human attention from street view images. Second, utilizing large language models (LLMs), his work proposed a Soundscape-to-Image Diffusion model to visualize and translate human auditory perceptions into visual representations of place. His work demonstrated how human multi-sensory experiences can be linked to comprehensively understand human sense of place using Generative AI. Finally, he will share his multifaceted experiences in GISphere that aim to promote global GIScience education.&lt;/p&gt;
&lt;/blockquote&gt;</description></item><item><title>Filip Biljecki joins the Editorial Board of Scientific Data</title><link>https://ual.sg/post/2024/05/19/filip-biljecki-joins-the-editorial-board-of-scientific-data/</link><pubDate>Sun, 19 May 2024 13:39:19 +0800</pubDate><guid>https://ual.sg/post/2024/05/19/filip-biljecki-joins-the-editorial-board-of-scientific-data/</guid><description>&lt;p&gt;Dr &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, the principal investigator of our research group, joins the Editorial Board of &lt;a href="https://www.nature.com/sdata/" target="_blank" rel="noopener"&gt;Scientific Data&lt;/a&gt;, a top journal that is part of the &lt;a href="https://www.nature.com/nature-portfolio" target="_blank" rel="noopener"&gt;Nature Portfolio&lt;/a&gt;.
&lt;a href="https://www.nature.com/sdata/journal-information" target="_blank" rel="noopener"&gt;It is a peer-reviewed, open-access journal for descriptions of datasets, and research that advances the sharing and reuse of scientific data&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Our Lab is committed to open science, and &lt;a href="https://ual.sg/data-code/"&gt;often releases data and software outputs openly&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We have a couple of new exciting open datasets and open-source software packages coming out soon, stay tuned for the announcements!&lt;/p&gt;</description></item><item><title>New paper: G2Viz: an online tool for visualizing and analyzing a public transit system from GTFS data</title><link>https://ual.sg/post/2024/05/09/new-paper-g2viz-an-online-tool-for-visualizing-and-analyzing-a-public-transit-system-from-gtfs-data/</link><pubDate>Thu, 09 May 2024 10:23:16 +0800</pubDate><guid>https://ual.sg/post/2024/05/09/new-paper-g2viz-an-online-tool-for-visualizing-and-analyzing-a-public-transit-system-from-gtfs-data/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Para S, Wirotsasithon T, Jundee T, Demissie MG, Sekimoto Y, Biljecki F, Phithakkitnukoon S (2024): G2Viz: an online tool for visualizing and analyzing a public transit system from GTFS data. &lt;em&gt;Public Transport&lt;/em&gt; 16(3): 893-928. &lt;a href="https://doi.org/10.1007/s12469-024-00362-x" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1007/s12469-024-00362-x&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-pt-g-2-viz/2024-pt-g-2-viz.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by the group of &lt;a href="https://cpemis.eng.cmu.ac.th/~santi/" target="_blank" rel="noopener"&gt;Santi Phithakkitnukoon&lt;/a&gt; from Chiang Mai University.&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://rdcu.be/dHocM" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;G2Viz can be accessed &lt;a href="https://g2viz.citycontext.info" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/09/new-paper-g2viz-an-online-tool-for-visualizing-and-analyzing-a-public-transit-system-from-gtfs-data/1_hu_3a3cf3155e6d6a4a.webp 400w,
/post/2024/05/09/new-paper-g2viz-an-online-tool-for-visualizing-and-analyzing-a-public-transit-system-from-gtfs-data/1_hu_a0b7fe48126a29ca.webp 760w,
/post/2024/05/09/new-paper-g2viz-an-online-tool-for-visualizing-and-analyzing-a-public-transit-system-from-gtfs-data/1_hu_9943c01f02402d66.webp 1200w"
src="https://ual.sg/post/2024/05/09/new-paper-g2viz-an-online-tool-for-visualizing-and-analyzing-a-public-transit-system-from-gtfs-data/1_hu_3a3cf3155e6d6a4a.webp"
width="627"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Public transit agencies have amassed substantial data through on-board and off-board sensors over the years. While data collection was the primary focus, there is now a shift towards deriving actionable insights from this wealth of information. As data-driven decision making becomes increasingly vital, there is a growing need for effective ways to visualize and convey complex insights to decision makers. This study addresses this need by introducing G2Viz, a visualizer for public transit operations. The development process of G2Viz spans requirement gathering, planning, and design, encompassing software architecture, data models, user interfaces, and system components. Rigorous implementation and testing ensure the tool’s functionality and effectiveness. G2Viz, designed to dynamically visualize public transit operations using General Transit Feed Specification (GTFS) data, is a web application accessible globally via any web browser. Its open-source nature, robustness, and versatility facilitate communication among transit agencies, users, researchers, and city authorities. G2Viz empowers transit planners to make well-informed decisions about public transportation. (Access G2Viz at &lt;a href="https://g2viz.citycontext.info" target="_blank" rel="noopener"&gt;https://g2viz.citycontext.info&lt;/a&gt;).&lt;/p&gt;
&lt;/blockquote&gt;
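&lt;p&gt;As background for readers unfamiliar with the format: a GTFS feed is essentially a ZIP archive of plain CSV text files (&lt;code&gt;stops.txt&lt;/code&gt;, &lt;code&gt;routes.txt&lt;/code&gt;, &lt;code&gt;trips.txt&lt;/code&gt;, &lt;code&gt;stop_times.txt&lt;/code&gt;, and so on), which is what allows a tool like G2Viz to ingest feeds from agencies worldwide. The minimal sketch below is not part of G2Viz and uses made-up sample rows; it only illustrates how stop coordinates in a feed&amp;rsquo;s &lt;code&gt;stops.txt&lt;/code&gt; can be parsed with Python&amp;rsquo;s standard library:&lt;/p&gt;

```python
import csv
import io

# stops.txt in a GTFS feed lists each stop with an ID, a human-readable
# name, and WGS84 coordinates. These rows are illustrative placeholders,
# not data from any real transit feed.
stops_txt = """stop_id,stop_name,stop_lat,stop_lon
S1,Central Station,13.7367,100.5232
S2,University Gate,13.7563,100.5018
"""

# Parse the CSV into a mapping of stop_id -> (lat, lon),
# ready to be plotted on a web map.
reader = csv.DictReader(io.StringIO(stops_txt))
stops = {row["stop_id"]: (float(row["stop_lat"]), float(row["stop_lon"]))
         for row in reader}

print(sorted(stops))  # ['S1', 'S2']
```

In a real feed one would open the file from the ZIP archive (e.g. with &lt;code&gt;zipfile&lt;/code&gt;) rather than from an inline string, but the per-row parsing is the same.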
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-pt-g-2-viz/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-pt-g-2-viz/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/09/new-paper-g2viz-an-online-tool-for-visualizing-and-analyzing-a-public-transit-system-from-gtfs-data/page-one_hu_f1e3716b19e6470f.webp 400w,
/post/2024/05/09/new-paper-g2viz-an-online-tool-for-visualizing-and-analyzing-a-public-transit-system-from-gtfs-data/page-one_hu_fdc2fb3f9a856f25.webp 760w,
/post/2024/05/09/new-paper-g2viz-an-online-tool-for-visualizing-and-analyzing-a-public-transit-system-from-gtfs-data/page-one_hu_d6d47d7f4b6e2d6e.webp 1200w"
src="https://ual.sg/post/2024/05/09/new-paper-g2viz-an-online-tool-for-visualizing-and-analyzing-a-public-transit-system-from-gtfs-data/page-one_hu_f1e3716b19e6470f.webp"
width="475"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_pt_g2viz&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Para, Sirapop and Wirotsasithon, Thanachok and Jundee, Thanisorn and Demissie, Merkebe Getachew and Sekimoto, Yoshihide and Biljecki, Filip and Phithakkitnukoon, Santi}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1007/s12469-024-00362-x}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Public Transport}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{16}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{3}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{893--928}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{G2Viz: an online tool for visualizing and analyzing a public transit system from GTFS data}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Assessing governance implications of city digital twin technology: A maturity model approach</title><link>https://ual.sg/post/2024/05/07/new-paper-assessing-governance-implications-of-city-digital-twin-technology-a-maturity-model-approach/</link><pubDate>Tue, 07 May 2024 10:23:16 +0800</pubDate><guid>https://ual.sg/post/2024/05/07/new-paper-assessing-governance-implications-of-city-digital-twin-technology-a-maturity-model-approach/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Haraguchi M, Funahashi T, Biljecki F (2024): Assessing governance implications of city digital twin technology: A maturity model approach. &lt;em&gt;Technological Forecasting and Social Change&lt;/em&gt; 204: 123409. &lt;a href="https://doi.org/10.1016/j.techfore.2024.123409" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.techfore.2024.123409&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-tfsc-dt-maturity/2024-tfsc-dt-maturity.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by Masahiko Haraguchi from Harvard University.&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/c/1j2nl98SG~VS3" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-06-26.&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Our maturity model assesses the different development stages of city digital twins.&lt;/li&gt;
&lt;li&gt;We identify opportunities and impediments to better governance.&lt;/li&gt;
&lt;li&gt;City digital twin technologies can enhance public participation in urban planning.&lt;/li&gt;
&lt;li&gt;The technologies can resolve smart cities&amp;rsquo; long-standing challenges.&lt;/li&gt;
&lt;li&gt;Challenges remain, including interoperability, participation, and inclusivity concerns.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Digital twin technology has great potential to transform urban planning. However, the governance aspects of city-scale digital twins (CDTs)— a virtual representation of urban environments —are understudied. This study bridges this knowledge gap by adopting a framework that scrutinizes the maturity stages of technology. We introduce the CITYSTEPS Maturity Model, a pioneering maturity framework tailored for CDTs, to assess all development stages of CDTs, including those utilizing artificial intelligence, and analyze the technology&amp;rsquo;s role in urban governance. We highlight the promise of CDTs in enhancing public participation in urban planning and addressing key smart city concerns, such as accountability and transparency. However, significant challenges remain, including public participation, public trust in privacy protection, and technical impediments like inadequate data integration, systems integration, and interoperability. There&amp;rsquo;s also the pressing issue of social inclusion: the potential exclusion of marginalized groups, including those often overlooked in data collection, like the hidden homeless and informal sector workers. We propose CDTs should be designed with a human-centric approach, transparent and unbiased data collection and algorithm development, and be led by an adaptive regulatory framework. The CITYSTEPS Maturity Model lays out a framework to assess CDTs&amp;rsquo; present state, forecast their future, and understand their governance implications, promoting more inclusive technology adoption.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-tfsc-dt-maturity/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-tfsc-dt-maturity/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/05/07/new-paper-assessing-governance-implications-of-city-digital-twin-technology-a-maturity-model-approach/page-one_hu_3a1a441734225e62.webp 400w,
/post/2024/05/07/new-paper-assessing-governance-implications-of-city-digital-twin-technology-a-maturity-model-approach/page-one_hu_5d5e76ca1e2623c3.webp 760w,
/post/2024/05/07/new-paper-assessing-governance-implications-of-city-digital-twin-technology-a-maturity-model-approach/page-one_hu_57c7644f806d663d.webp 1200w"
src="https://ual.sg/post/2024/05/07/new-paper-assessing-governance-implications-of-city-digital-twin-technology-a-maturity-model-approach/page-one_hu_3a1a441734225e62.webp"
width="582"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_tfsc_dt_maturity&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Haraguchi, Masahiko and Funahashi, Tomomi and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.techfore.2024.123409}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Technological Forecasting and Social Change}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{123409}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Assessing governance implications of city digital twin technology: A maturity model approach}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{204}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Mapping the landscape and roadmap of geospatial artificial intelligence (GeoAI) in quantitative human geography</title><link>https://ual.sg/post/2024/04/11/new-paper-mapping-the-landscape-and-roadmap-of-geospatial-artificial-intelligence-geoai-in-quantitative-human-geography/</link><pubDate>Thu, 11 Apr 2024 10:23:16 +0800</pubDate><guid>https://ual.sg/post/2024/04/11/new-paper-mapping-the-landscape-and-roadmap-of-geospatial-artificial-intelligence-geoai-in-quantitative-human-geography/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Wang S, Huang X, Liu P, Zhang M, Biljecki F, Hu T, Fu X, Liu L, Liu X, Wang R, Huang Y, Yan J, Jiang J, Chukwu M, Reza Naghedi S, Hemmati M, Shao Y, Jia N, Xiao Z, Tian T, Hu Y, Yu L, Yap W, Macatulad E, Chen Z, Cui Y, Ito K, Ye M, Fan Z, Lei B, Bao S (2024): Mapping the landscape and roadmap of geospatial artificial intelligence (GeoAI) in quantitative human geography: An extensive systematic review. &lt;em&gt;International Journal of Applied Earth Observation and Geoinformation&lt;/em&gt; 128: 103734. &lt;a href="https://doi.org/10.1016/j.jag.2024.103734" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.jag.2024.103734&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-jag-geoai-hg/2024-jag-geoai-hg.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;It is a massive review of GeoAI in quantitative human geography, in which eight of our Lab members were involved. Starting from an extensive corpus of 14,537 papers, 1,516 were reviewed to outline the applications of GeoAI in this domain, together with its issues and challenges, and to offer an outlook on future directions and research opportunities.&lt;/p&gt;
&lt;p&gt;The review was led by Siqin (Sisi) Wang from the University of Southern California and Xiao Huang from Emory University, and it included 31 authors from 22 institutes in five countries and regions (🇺🇸🇦🇺🇨🇳🇭🇰🇸🇬).&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/04/11/new-paper-mapping-the-landscape-and-roadmap-of-geospatial-artificial-intelligence-geoai-in-quantitative-human-geography/1_hu_c1d08de54b38bd39.webp 400w,
/post/2024/04/11/new-paper-mapping-the-landscape-and-roadmap-of-geospatial-artificial-intelligence-geoai-in-quantitative-human-geography/1_hu_8456baf9cc36ea53.webp 760w,
/post/2024/04/11/new-paper-mapping-the-landscape-and-roadmap-of-geospatial-artificial-intelligence-geoai-in-quantitative-human-geography/1_hu_9af92902e5b4baa0.webp 1200w"
src="https://ual.sg/post/2024/04/11/new-paper-mapping-the-landscape-and-roadmap-of-geospatial-artificial-intelligence-geoai-in-quantitative-human-geography/1_hu_c1d08de54b38bd39.webp"
width="760"
height="697"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;It reviewed 1516 papers using GeoAI in human geography related studies.&lt;/li&gt;
&lt;li&gt;This review covers 14 subdomains of human geography.&lt;/li&gt;
&lt;li&gt;It elaborates on the current progress and status of GeoAI applications within each subdomain.&lt;/li&gt;
&lt;li&gt;It points out the issues and challenges for using GeoAI in future human geography studies.&lt;/li&gt;
&lt;li&gt;It proposes the directions and research opportunities for future studies.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;This paper brings a comprehensive systematic review of the application of geospatial artificial intelligence (GeoAI) in quantitative human geography studies, including the subdomains of cultural, economic, political, historical, urban, population, social, health, rural, regional, tourism, behavioural, environmental and transport geography. In this extensive review, we obtain 14,537 papers from the Web of Science in the relevant fields and select 1516 papers that we identify as human geography studies using GeoAI via human scanning conducted by several research groups around the world. We outline the GeoAI applications in human geography by systematically summarising the number of publications over the years, empirical studies across countries, the categories of data sources used in GeoAI applications, and their modelling tasks across different subdomains. We find out that existing human geography studies have limited capacity to monitor complex human behaviour and examine the non-linear relationship between human behaviour and its potential drivers—such limits can be overcome by GeoAI models with the capacity to handle complexity. We elaborate on the current progress and status of GeoAI applications within each subdomain of human geography, point out the issues and challenges, as well as propose the directions and research opportunities for using GeoAI in future human geography studies in the context of sustainable and open science, generative AI, and quantum revolution.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-jag-geoai-hg/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-jag-geoai-hg/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/04/11/new-paper-mapping-the-landscape-and-roadmap-of-geospatial-artificial-intelligence-geoai-in-quantitative-human-geography/page-one_hu_986ef5ac50808d80.webp 400w,
/post/2024/04/11/new-paper-mapping-the-landscape-and-roadmap-of-geospatial-artificial-intelligence-geoai-in-quantitative-human-geography/page-one_hu_67ee168bd8acec17.webp 760w,
/post/2024/04/11/new-paper-mapping-the-landscape-and-roadmap-of-geospatial-artificial-intelligence-geoai-in-quantitative-human-geography/page-one_hu_2308c8cb02eb7df8.webp 1200w"
src="https://ual.sg/post/2024/04/11/new-paper-mapping-the-landscape-and-roadmap-of-geospatial-artificial-intelligence-geoai-in-quantitative-human-geography/page-one_hu_986ef5ac50808d80.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_jag_geoai_hg&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Wang, Siqin and Huang, Xiao and Liu, Pengyuan and Zhang, Mengxi and Biljecki, Filip and Hu, Tao and Fu, Xiaokang and Liu, Lingbo and Liu, Xintao and Wang, Ruomei and Huang, Yuanyuan and Yan, Jingjing and Jiang, Jinghan and Chukwu, Michaelmary and Reza Naghedi, Seyed and Hemmati, Moein and Shao, Yaxiong and Jia, Nan and Xiao, Zhiyang and Tian, Tian and Hu, Yaxin and Yu, Lixiaona and Yap, Winston and Macatulad, Edgardo and Chen, Zhuo and Cui, Yunhe and Ito, Koichi and Ye, Mengbi and Fan, Zicheng and Lei, Binyu and Bao, Shuming}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.jag.2024.103734}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;issn&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1569-8432}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Applied Earth Observation and Geoinformation}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{103734}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Mapping the landscape and roadmap of geospatial artificial intelligence (GeoAI) in quantitative human geography: An extensive systematic review}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{128}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Government officials from five European cities visit our group</title><link>https://ual.sg/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/</link><pubDate>Sat, 09 Mar 2024 13:39:19 +0800</pubDate><guid>https://ual.sg/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/</guid><description>&lt;p&gt;Our Lab hosted government officials of five European cities (Hamburg, Helsinki, Prague, Rotterdam &amp;amp; Vienna) for a visit to our research group to learn more about our efforts in advancing urban digital twins.
A big thank you to the Singapore Land Authority (SLA), which kindly facilitated the visit as part of the 5 Cities Connect Plus (5CC+) Meeting in Singapore, hosted by SLA.&lt;/p&gt;
&lt;p&gt;5CC+ is a community of the local governments of Helsinki, Hamburg, Prague, Vienna, Rotterdam, and Singapore. 🇩🇪🇫🇮🇨🇿🇳🇱🇦🇹🇸🇬&lt;/p&gt;
&lt;p&gt;It also includes companies such as virtualcitySYSTEMS &amp;amp; Future Insight Group.
Its aim is to share knowledge and jointly address challenges related to the implementation of Digital Urban Twins.&lt;/p&gt;
&lt;p&gt;The guests also toured SDE4, the first newly built net-zero-energy building in the country, which houses our NUS Urban Analytics Lab.
Our sister labs and others contributed to the sharing session as well, presenting ongoing cohesive multi-scale &amp;amp; multidisciplinary digital twin research efforts at the National University of Singapore. We thank our European guests for visiting us and wish them an enjoyable stay in Singapore and at SLA!&lt;/p&gt;
&lt;p&gt;Many thanks for the contributions by the NUS Building Robotics Lab, BEAM initiative, BUDS Lab, Singapore-ETH Centre, and the NUS Department of Architecture; and by Marcel Ignatius, Matias Quintana, Binyu Lei, Ali Ghahramani, Clayton Miller, and Alakesh Dutta.&lt;/p&gt;
&lt;p&gt;Looking forward to future collaborations!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/1_hu_d779a30f206fceef.webp 400w,
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/1_hu_5105ec58e1bed5c2.webp 760w,
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/1_hu_fec4acd0d2e32185.webp 1200w"
src="https://ual.sg/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/1_hu_d779a30f206fceef.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/2_hu_b8e7f9a8e25ce278.webp 400w,
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/2_hu_20b34f21ec7212ba.webp 760w,
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/2_hu_4014ccf2cd8313cc.webp 1200w"
src="https://ual.sg/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/2_hu_b8e7f9a8e25ce278.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/3_hu_da0893cb715d40b7.webp 400w,
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/3_hu_59e840181ab46eb4.webp 760w,
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/3_hu_481909cb433552ce.webp 1200w"
src="https://ual.sg/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/3_hu_da0893cb715d40b7.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/4_hu_52164f5701a494a0.webp 400w,
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/4_hu_d638fef79dafed30.webp 760w,
/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/4_hu_b77f832b7f1f628b.webp 1200w"
src="https://ual.sg/post/2024/03/09/government-officials-from-five-european-cities-visit-our-group/4_hu_52164f5701a494a0.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper: Microclimate spatio-temporal prediction using deep learning and land use data</title><link>https://ual.sg/post/2024/03/04/new-paper-microclimate-spatio-temporal-prediction-using-deep-learning-and-land-use-data/</link><pubDate>Mon, 04 Mar 2024 10:23:16 +0800</pubDate><guid>https://ual.sg/post/2024/03/04/new-paper-microclimate-spatio-temporal-prediction-using-deep-learning-and-land-use-data/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Han J, Chong A, Lim J, Ramasamy S, Wong NH, Biljecki F (2024): Microclimate spatio-temporal prediction using deep learning and land use data. &lt;em&gt;Building and Environment&lt;/em&gt; 253: 111358. &lt;a href="https://doi.org/10.1016/j.buildenv.2024.111358" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.buildenv.2024.111358&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-bae-microclimate-prediction/2024-bae-microclimate-prediction.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/jintong-han/"&gt;Jintong Han&lt;/a&gt;, who has been working on her PhD with &lt;a href="https://scholar.google.com.sg/citations?hl=en&amp;amp;user=Xm3qR2QAAAAJ" target="_blank" rel="noopener"&gt;Prof Adrian Chong&lt;/a&gt; from our sister group &lt;a href="https://ideaslab.io" target="_blank" rel="noopener"&gt;IDEAS Lab&lt;/a&gt;, and with whom we have been collaborating in the recent years.
Congratulations on this important journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The code has been released openly and can be accessed &lt;a href="https://github.com/ideas-lab-nus/microclimate-dl-predict" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/03/04/new-paper-microclimate-spatio-temporal-prediction-using-deep-learning-and-land-use-data/1_hu_18c6678ddbcae4bd.webp 400w,
/post/2024/03/04/new-paper-microclimate-spatio-temporal-prediction-using-deep-learning-and-land-use-data/1_hu_8f73e27660ecc18e.webp 760w,
/post/2024/03/04/new-paper-microclimate-spatio-temporal-prediction-using-deep-learning-and-land-use-data/1_hu_d8f08b5de867dcb9.webp 1200w"
src="https://ual.sg/post/2024/03/04/new-paper-microclimate-spatio-temporal-prediction-using-deep-learning-and-land-use-data/1_hu_18c6678ddbcae4bd.webp"
width="760"
height="329"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/c/1igmX1HudNFeZI" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-04-19.&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Microclimates could exhibit noticeable variations within small spatial and temporal scales.&lt;/li&gt;
&lt;li&gt;Combining spatial and temporal knowledge contributes to the microclimate prediction accuracy.&lt;/li&gt;
&lt;li&gt;Integrating LULC data enhances the stability of prediction errors.&lt;/li&gt;
&lt;li&gt;Different LULC features have varying temporal effects on microclimate predictions.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Urban microclimate prediction is crucial for various fields, including Building Performance Simulation (BPS), outdoor thermal comfort, building life cycle, and residential health. Existing methods involve using classical weather file data, such as Typical Meteorological Years (TMY), or machine learning techniques for time-based forecasting. However, the incorporation of both spatial and temporal dimensions and land use/land cover (LULC) data is seldom considered. This paper proposes a novel approach to predict microclimate: the Geo-LSTM-Kriging model, which is applicable for fine-scale microclimate prediction within a few hundred meters around weather stations. The Geo-layer processes and learns from LULC data, the LSTM layer learns from historical data, and the Kriging layer extracts spatial distance information. This comprehensive combination integrates spatial, temporal, and environmental conditions, providing accurate results with higher spatial resolution (1 m x 1 m) and shorter time intervals (10 min). These prediction results were achieved by employing statistical downscaling calculation and utilizing data from 14 weather stations located within our university campus. Upon the analysis of these prediction results, we found that the proposed model can accurately predict temperature and humidity at high spatial and temporal resolution. Compared to traditional interpolation models, the RMSE of temperature decreases from 1.59 °C to 0.64 °C, and the RMSE of relative humidity (RH) decreases from 7.70 to 3.23. A thorough analysis of the model prediction results reveals the varied impacts of different LULC features on microclimate predictions, highlighting the value of the proposed model and the importance of incorporating LULC data.&lt;/p&gt;
&lt;/blockquote&gt;
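&lt;p&gt;As a quick illustration of the RMSE metric used in the comparison above, here is a minimal, self-contained sketch; the helper function and the sample values are our own illustration, not the paper's code or data:&lt;/p&gt;

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error between paired prediction/observation sequences."""
    if len(predicted) != len(observed):
        raise ValueError("sequences must have the same length")
    return math.sqrt(
        sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(predicted)
    )

# Illustrative temperature readings only (not the paper's data): a model whose
# predictions track the observations closely yields a lower RMSE than a
# coarser baseline interpolation, which is the criterion used in the abstract.
observed = [30.1, 30.5, 29.8, 31.0, 30.2]
baseline = [31.5, 28.9, 31.4, 29.5, 31.9]  # larger deviations, larger RMSE
model    = [30.4, 30.1, 30.0, 31.3, 30.5]  # smaller deviations, smaller RMSE

print(rmse(baseline, observed) > rmse(model, observed))  # True
```

&lt;p&gt;The same computation, applied to the study's predictions and station measurements, underlies the reported drop from 1.59 °C to 0.64 °C for temperature.&lt;/p&gt;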
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-bae-microclimate-prediction/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-bae-microclimate-prediction/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/03/04/new-paper-microclimate-spatio-temporal-prediction-using-deep-learning-and-land-use-data/page-one_hu_5683975743639911.webp 400w,
/post/2024/03/04/new-paper-microclimate-spatio-temporal-prediction-using-deep-learning-and-land-use-data/page-one_hu_7f2845cc0bc2e7c9.webp 760w,
/post/2024/03/04/new-paper-microclimate-spatio-temporal-prediction-using-deep-learning-and-land-use-data/page-one_hu_3aeaabcf53eef904.webp 1200w"
src="https://ual.sg/post/2024/03/04/new-paper-microclimate-spatio-temporal-prediction-using-deep-learning-and-land-use-data/page-one_hu_5683975743639911.webp"
width="585"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_bae_microclimate_prediction&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Han, Jintong and Chong, Adrian and Lim, Joie and Ramasamy, Savitha and Wong, Nyuk Hien and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.buildenv.2024.111358}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Building and Environment}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{111358}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Microclimate spatio-temporal prediction using deep learning and land use data}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{253}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>The papers from 3D GeoInfo 2023 are published</title><link>https://ual.sg/post/2024/02/22/the-papers-from-3d-geoinfo-2023-are-published/</link><pubDate>Thu, 22 Feb 2024 06:29:43 +0800</pubDate><guid>https://ual.sg/post/2024/02/22/the-papers-from-3d-geoinfo-2023-are-published/</guid><description>&lt;p&gt;The &lt;a href="https://www.3dgeoinfo.org/3dgeoinfo/" target="_blank" rel="noopener"&gt;18th International 3D GeoInfo Conference 2023&lt;/a&gt; took place in Munich, Germany on 13-14 September 2023.
It was hosted by the &lt;a href="https://www.asg.ed.tum.de/en/gis/home/" target="_blank" rel="noopener"&gt;Chair of Geoinformatics at the Technical University of Munich&lt;/a&gt; led by &lt;a href="https://www.asg.ed.tum.de/en/gis/our-team/staff/prof-thomas-h-kolbe/" target="_blank" rel="noopener"&gt;Professor Thomas H. Kolbe&lt;/a&gt;.
We have &lt;a href="https://ual.sg/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/"&gt;a blog post&lt;/a&gt; about it.&lt;/p&gt;
&lt;p&gt;Now the papers from the conference have been published by Springer, as a book &lt;em&gt;Recent Advances in 3D Geoinformation Science&lt;/em&gt; (edited by Thomas H. Kolbe, Andreas Donaubauer, Christof Beil), which is part of the series: Lecture Notes in Geoinformation and Cartography (LNGC).
The proceedings of the 18th 3D GeoInfo Conference, featuring 51 papers, are available &lt;a href="https://link.springer.com/book/10.1007/978-3-031-43699-4" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.
We thank the editors for their dedicated work and liaising with the publisher.&lt;/p&gt;
&lt;p&gt;Two papers of ours were published and presented at the conference:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Lei B, Su Y, Biljecki F (2024): Humans As Sensors in Urban Digital Twins. In: Kolbe TH, Donaubauer A, Beil C (eds) Recent Advances in 3D Geoinformation Science, pp. 693-706. 3DGeoInfo 2023. Lecture Notes in Geoinformation and Cartography. Springer, Cham. &lt;a href="https://doi.org/10.1007/978-3-031-43699-4_42" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1007/978-3-031-43699-4_42&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-3-dgeoinfo-humans-sensors/2024-3-dgeoinfo-humans-sensors.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Van der Vaart J, Stoter J, Diakité A, Biljecki F, Arroyo Ohori K, Hakim A (2024): Assessment of the LoD Specification for the Integration of BIM-Derived Building Models in 3D City Models. In: Kolbe TH, Donaubauer A, Beil C (eds) Recent Advances in 3D Geoinformation Science. 3DGeoInfo 2023, pp. 171-191. Lecture Notes in Geoinformation and Cartography. Springer, Cham. &lt;a href="https://doi.org/10.1007/978-3-031-43699-4_11" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1007/978-3-031-43699-4_11&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-3-dgeoinfo-bim-lod/2024-3-dgeoinfo-bim-lod.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The first paper, led by &lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt;, won the best paper award of the conference. 🏆&lt;/p&gt;
&lt;p&gt;See you at &lt;a href="https://3dgeoinfoeg-ice.webs.uvigo.es" target="_blank" rel="noopener"&gt;3D GeoInfo 2024&lt;/a&gt; in Vigo, Spain!&lt;/p&gt;
&lt;p&gt;BibTeX citations:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@inbook&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_3dgeoinfo_humans_sensors&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Lei, Binyu and Su, Yunlei and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;booktitle&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Recent Advances in 3D Geoinformation Science}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1007/978-3-031-43699-4_42}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;isbn&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{9783031436994}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;issn&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1863-2351}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{693--706}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;publisher&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Springer Nature Switzerland}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Humans As Sensors in Urban Digital Twins}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@inbook&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_3dgeoinfo_bim_lod&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{van der Vaart, Jasper and Stoter, Jantien and Diakit{\&amp;#39;e}, Abdoulaye and Biljecki, Filip and Arroyo Ohori, Ken and Hakim, Amir}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;booktitle&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Recent Advances in 3D Geoinformation Science}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1007/978-3-031-43699-4_11}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;isbn&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{9783031436994}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;issn&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1863-2351}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{171--191}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;publisher&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Springer Nature Switzerland}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Assessment of the LoD Specification for the Integration of BIM-Derived Building Models in 3D City Models}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Continuing from the Sendai Framework midterm: Opportunities for urban digital twins in disaster risk management</title><link>https://ual.sg/post/2024/02/07/new-paper-continuing-from-the-sendai-framework-midterm-opportunities-for-urban-digital-twins-in-disaster-risk-management/</link><pubDate>Wed, 07 Feb 2024 08:09:22 +0800</pubDate><guid>https://ual.sg/post/2024/02/07/new-paper-continuing-from-the-sendai-framework-midterm-opportunities-for-urban-digital-twins-in-disaster-risk-management/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Macatulad E, Biljecki F (2024): Continuing from the Sendai Framework midterm: Opportunities for urban digital twins in disaster risk management. International Journal of Disaster Risk Reduction, 102: 104310. &lt;a href="https://doi.org/10.1016/j.ijdrr.2024.104310" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.ijdrr.2024.104310&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-ijdrr-sendai/2024-ijdrr-sendai.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/edgardo-g.-macatulad/"&gt;Edgardo G. Macatulad&lt;/a&gt;.
Congratulations on this journal publication that is part of his PhD research! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1iYqH7t2zZHCdK" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-03-27.&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;A review of urban disaster risk management research in the Sendai Framework context.&lt;/li&gt;
&lt;li&gt;Revealing challenges due to the multi-faceted character of disaster risk management.&lt;/li&gt;
&lt;li&gt;Identifying research directions in urban digital twins for disaster risk management.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Urban areas and cities face risks caused by the compounding impacts of urbanization and increasing frequency of disasters. The importance of implementing disaster risk management integrated with strategies to achieve sustainable urban development is highlighted by the United Nations Office for Disaster Risk Reduction (UNDRR) through the Sendai Framework for Disaster Risk Reduction 2015–2030 (SFDRR), which provides guidelines for monitoring and reporting the implementation of disaster risk reduction strategies towards resilience and sustainability. In this paper, we present a systematic review of the studies on urban disaster risk management since the Sendai Framework’s adoption in 2015 until 2022—at its midterm—identifying implementation challenges that urban digital twins can possibly address. Our study involved two stages. First, a scoping review looked at the profile of journal articles and their research trends on the topic. Second, 141 publications were selected for full-text review and synthesis within the context of the Sendai Framework priorities of action. In these studies, research on urban resilience has gained increased attention, but the importance of risk assessment is still highlighted as one of the critical processes of disaster risk management. Overall, the reviewed studies reveal the complexity of disaster risk and management—requiring research considerations in different facets of: multi-dimension, multi-scale, multi-stakeholder, multi-hazard, and multi-perspective. Research directions show opportunities for urban digital twins in disaster risk management—particularly, as the integrating framework and platform of urban systems and disaster risk management processes.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-ijdrr-sendai/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-ijdrr-sendai/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/02/07/new-paper-continuing-from-the-sendai-framework-midterm-opportunities-for-urban-digital-twins-in-disaster-risk-management/page-one_hu_dce20bb38fdf3434.webp 400w,
/post/2024/02/07/new-paper-continuing-from-the-sendai-framework-midterm-opportunities-for-urban-digital-twins-in-disaster-risk-management/page-one_hu_993d04093d7e4608.webp 760w,
/post/2024/02/07/new-paper-continuing-from-the-sendai-framework-midterm-opportunities-for-urban-digital-twins-in-disaster-risk-management/page-one_hu_bb5222ae754efad0.webp 1200w"
src="https://ual.sg/post/2024/02/07/new-paper-continuing-from-the-sendai-framework-midterm-opportunities-for-urban-digital-twins-in-disaster-risk-management/page-one_hu_dce20bb38fdf3434.webp"
width="566"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_ijdrr_sendai&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Macatulad, Edgardo and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.ijdrr.2024.104310}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Disaster Risk Reduction}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104310}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Continuing from the Sendai Framework midterm: Opportunities for urban digital twins in disaster risk management}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{102}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Learning visual features from figure-ground maps for urban morphology discovery</title><link>https://ual.sg/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/</link><pubDate>Sun, 04 Feb 2024 20:05:22 +0800</pubDate><guid>https://ual.sg/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Wang J, Huang W, Biljecki F (2024): Learning visual features from figure-ground maps for urban morphology discovery. Computers, Environment and Urban Systems, 109: 102076. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2024.102076" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2024.102076&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-ceus-urban-form-discovery/2024-ceus-urban-form-discovery.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/jing-wang/"&gt;Jing Wang&lt;/a&gt;, our Master of Urban Planning graduate.
Congratulations on this important journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/1_hu_b1c95c350b245bc3.webp 400w,
/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/1_hu_7c98d9c736afbee5.webp 760w,
/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/1_hu_7dfa75f69553780a.webp 1200w"
src="https://ual.sg/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/1_hu_b1c95c350b245bc3.webp"
width="760"
height="391"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1iXjH_4XYgisg5" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2024-03-24.&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Simulating visual interpretation of urban figure-ground maps with machine eyes&lt;/li&gt;
&lt;li&gt;Building on visual representation learning while integrating morphological indicators&lt;/li&gt;
&lt;li&gt;A fully unsupervised approach for learning morphological features, encompassing the spatial layout&lt;/li&gt;
&lt;li&gt;Inventories of typical urban patterns in four cities globally&lt;/li&gt;
&lt;li&gt;Inner- and cross-city comparison of morphological homogeneity&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Most studies of urban morphology rely on morphometrics, such as building area and street length. However, these methods often fall short in capturing visual patterns that carry abundant information about the configuration of urban elements and how they interact spatially. In this study, we introduce a novel method for learning morphology features based on figure-ground maps, which leverages recent developments in computer vision. Our method facilitates discovering and comparing urban form types in a fully unsupervised manner. Specifically, we examine building fabrics by 1 km patches. A visual representation learning model (SimCLR) casts each patch into a latent embedding space where similar patches are clustered while dissimilar patches are dispelled, thus generating morphology representations that entail the layout of building groups. The learned morphology features are tested in urban form typology clustering and comparison tasks in four diverse cities: Singapore, San Francisco, Barcelona, and Amsterdam, with data sourced from OpenStreetMap. Clustering results show effective identification of typical urban morphology types corresponding to urban functions and historical developments. Further analyses based on the representations reveal inner- and cross-city morphological homogeneity relating to socio-economic drivers. We conclude that this method is a promising alternative for effectively describing urban patterns in morphology analysis.&lt;/p&gt;
&lt;/blockquote&gt;
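&lt;p&gt;As an illustrative aside (not code from the paper), the clustering step described in the abstract can be sketched as follows. Random vectors stand in for the learned SimCLR embeddings of 1 km figure-ground patches, and the cluster count is an arbitrary placeholder:&lt;/p&gt;

```python
# Hypothetical sketch: discovering urban form types from patch embeddings.
# The SimCLR encoder itself is omitted; random vectors stand in for the
# learned 128-d morphology embeddings of 1 km figure-ground patches.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(42)
embeddings = rng.normal(size=(200, 128))  # 200 patches, 128-d embeddings

# Group patches into candidate urban form types (6 clusters is a placeholder).
kmeans = KMeans(n_clusters=6, n_init=10, random_state=42).fit(embeddings)
labels = kmeans.labels_

# A simple within-city homogeneity proxy: mean pairwise cosine similarity.
homogeneity = cosine_similarity(embeddings).mean()
print(labels.shape, homogeneity)
```

Comparing this homogeneity value within and across cities mirrors, in spirit, the inner- and cross-city analyses in the paper.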
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/2_hu_236ed300a0e98f1d.webp 400w,
/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/2_hu_efc81a0d1bf9cbda.webp 760w,
/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/2_hu_8cea6f151adaca9f.webp 1200w"
src="https://ual.sg/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/2_hu_236ed300a0e98f1d.webp"
width="679"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-ceus-urban-form-discovery/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-ceus-urban-form-discovery/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/page-one_hu_f0175670f9743f92.webp 400w,
/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/page-one_hu_ed5222c02a28c548.webp 760w,
/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/page-one_hu_e79f07fe9ced9c4d.webp 1200w"
src="https://ual.sg/post/2024/02/04/new-paper-learning-visual-features-from-figure-ground-maps-for-urban-morphology-discovery/page-one_hu_f0175670f9743f92.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_ceus_urban_form_discovery&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Wang, Jing and Huang, Weiming and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2024.102076}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{102076}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Learning visual features from figure-ground maps for urban morphology discovery}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{109}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Crowdsourcing Geospatial Data for Earth and Human Observations: A Review</title><link>https://ual.sg/post/2024/01/23/new-paper-crowdsourcing-geospatial-data-for-earth-and-human-observations-a-review/</link><pubDate>Tue, 23 Jan 2024 14:08:13 +0800</pubDate><guid>https://ual.sg/post/2024/01/23/new-paper-crowdsourcing-geospatial-data-for-earth-and-human-observations-a-review/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Huang X, Wang S, Yang D, Hu T, Chen M, Zhang M, Zhang G, Biljecki F, Lu T, Zou L, Wu CHY, Park YM, Li X, Liu Y, Fan H, Mitchell J, Li Z, Hohl A (2024): Crowdsourcing Geospatial Data for Earth and Human Observations: A Review. &lt;em&gt;Journal of Remote Sensing&lt;/em&gt; 4: 0105. &lt;a href="https://doi.org/10.34133/remotesensing.0105" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.34133/remotesensing.0105&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-jrs-crowdsourcing/2024-jrs-crowdsourcing.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The article covers a wide range of data types and provenances, discusses challenges, and outlines future directions, among other topics.&lt;/p&gt;
&lt;p&gt;The review paper was led by &lt;a href="https://envs.emory.edu/people/bios/Huang-Xiao%20.html" target="_blank" rel="noopener"&gt;Xiao Huang&lt;/a&gt; from Emory University.&lt;/p&gt;
&lt;p&gt;It was put together by authors from 18 university departments around the world (USA, UK, Singapore, and Norway): Xiao Huang (Emory University), Siqin Wang (University of Southern California), Di Yang (University of Wyoming), Tao Hu (Oklahoma State University), Meixu Chen (University of Liverpool), Mengxi Zhang (Virginia Tech), Guiming Zhang (University of Denver), Filip Biljecki (National University of Singapore), Tianjun Lu (University of Kentucky), Lei Zou (Texas A&amp;amp;M University), Connor Y.H. Wu (Oklahoma State University), Yoo Min Park (University of Connecticut), Xiao Li (University of Oxford), Yunzhe Liu (Imperial College London), Hongchao Fan (Norwegian University of Science and Technology), Jessica Mitchell (University of Montana), Zhenlong Li (The Pennsylvania State University), and Alexander Hohl (The University of Utah).&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;The transformation from authoritative to user-generated data landscapes has garnered considerable attention, notably with the proliferation of crowdsourced geospatial data. Facilitated by advancements in digital technology and high-speed communication, this paradigm shift has democratized data collection, obliterating traditional barriers between data producers and users. While previous literature has compartmentalized this subject into distinct platforms and application domains, this review offers a holistic examination of crowdsourced geospatial data. Employing a narrative review approach due to the interdisciplinary nature of the topic, we investigate both human and Earth observations through crowdsourced initiatives. This review categorizes the diverse applications of these data and rigorously examines specific platforms and paradigms pertinent to data collection. Furthermore, it addresses salient challenges, encompassing data quality, inherent biases, and ethical dimensions. We contend that this thorough analysis will serve as an invaluable scholarly resource, encapsulating the current state-of-the-art in crowdsourced geospatial data, and offering strategic directions for future interdisciplinary research and applications across various sectors.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-jrs-crowdsourcing/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-jrs-crowdsourcing/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2024/01/23/new-paper-crowdsourcing-geospatial-data-for-earth-and-human-observations-a-review/page-one_hu_46631c1f27d180ee.webp 400w,
/post/2024/01/23/new-paper-crowdsourcing-geospatial-data-for-earth-and-human-observations-a-review/page-one_hu_6eda803badcfb6dc.webp 760w,
/post/2024/01/23/new-paper-crowdsourcing-geospatial-data-for-earth-and-human-observations-a-review/page-one_hu_cdcffe513669dc62.webp 1200w"
src="https://ual.sg/post/2024/01/23/new-paper-crowdsourcing-geospatial-data-for-earth-and-human-observations-a-review/page-one_hu_46631c1f27d180ee.webp"
width="707"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_jrs_crowdsourcing&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Huang, Xiao and Wang, Siqin and Yang, Di and Hu, Tao and Chen, Meixu and Zhang, Mengxi and Zhang, Guiming and Biljecki, Filip and Lu, Tianjun and Zou, Lei and Wu, Connor Y. H. and Park, Yoo Min and Li, Xiao and Liu, Yunzhe and Fan, Hongchao and Mitchell, Jessica and Li, Zhenlong and Hohl, Alexander}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.34133/remotesensing.0105}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Journal of Remote Sensing}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Crowdsourcing Geospatial Data for Earth and Human Observations: A Review}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{4}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{0105}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Happy Holidays from our Lab</title><link>https://ual.sg/post/2023/12/23/happy-holidays-from-our-lab/</link><pubDate>Sat, 23 Dec 2023 11:18:29 +0800</pubDate><guid>https://ual.sg/post/2023/12/23/happy-holidays-from-our-lab/</guid><description>&lt;p&gt;Happy Holidays from the NUS Urban Analytics Lab &amp;amp; collaborators! 🏝️🥂
For those who celebrate - Merry Christmas! 🎄&lt;/p&gt;
&lt;p&gt;We are proud of what our small research group has achieved this year &amp;ndash; we &lt;a href="https://ual.sg/data-code"&gt;released a few open-source software packages and open datasets&lt;/a&gt;, &lt;a href="https://ual.sg/post"&gt;gave 40 invited talks in 12 countries&lt;/a&gt; (including &lt;a href="https://ual.sg/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/"&gt;many by our junior researchers&lt;/a&gt;), received a &lt;a href="https://ual.sg/post/2023/10/10/urban-informatics-paper-of-the-year-award-2023/"&gt;couple&lt;/a&gt; of &lt;a href="https://ual.sg/post/2023/02/27/cde-awards-and-recognition-2023/"&gt;awards&lt;/a&gt;, witnessed &lt;a href="https://ual.sg/post/2023/02/26/congratulations-to-dr-pengyuan-liu-on-a-faculty-position/"&gt;the first placements of Lab alumni as faculty elsewhere&lt;/a&gt;, and had lots of enjoyable collaborations and fun.
With ample exciting research, we wrote some papers as well &amp;ndash; we &lt;a href="https://ual.sg/publication"&gt;published&lt;/a&gt; more than 20 of them.
For the first time, we started a couple of &lt;a href="https://ual.sg/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/"&gt;industrial collaborations&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;As part of our community engagement, department service, and strengthening our network, we hosted &lt;a href="https://ual.sg/seminars"&gt;several visiting scholars and guest lecturers from all over the world&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;This year also marks the first time that &lt;a href="https://ual.sg/post/2023/07/03/our-university-is-now-ranked-top-10-worldwide/"&gt;our National University of Singapore ascended to be ranked within the top 10 universities worldwide&lt;/a&gt; according to QS (#8).&lt;/p&gt;
&lt;p&gt;We look forward to continuing our work in 2024, delivering new advancements that serve data-driven urban planning, and to expanding our collaborations.&lt;/p&gt;
&lt;p&gt;To read more about our work and agenda, visit our &lt;a href="https://ual.sg/about"&gt;About page&lt;/a&gt;, and follow &lt;a href="https://www.linkedin.com/company/urban-analytics-lab/" target="_blank" rel="noopener"&gt;our LinkedIn page&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Wishing everyone a Happy New Year!&lt;/p&gt;</description></item><item><title>FOSS4G Asia and research visits in Seoul</title><link>https://ual.sg/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/</link><pubDate>Fri, 22 Dec 2023 16:26:29 +0800</pubDate><guid>https://ual.sg/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/</guid><description>&lt;p&gt;The latest software development at the Urban Analytics Lab was recently featured at the FOSS4G Asia Conference in Seoul, South Korea 🇰🇷 by &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt;, PhD candidate at our research group.&lt;/p&gt;
&lt;p&gt;Supported by the FOSS4G Travel Grant Program, Winston presented his latest work on &lt;a href="https://github.com/winstonyym/urbanity" target="_blank" rel="noopener"&gt;Urbanity: A global tool for open urban network analysis&lt;/a&gt;, as part of the Urban Proximity - UN Habitat UTC session, which was co-organised by The Seoul Institute and Korea Planning Association.&lt;/p&gt;
&lt;p&gt;The presentation builds on two open access papers published in his PhD:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Yap W, Biljecki F (2023): A Global Feature-Rich Network Dataset of Cities and Dashboard for Comprehensive Urban Analyses. Scientific Data 10: 667. &lt;a href="https://doi.org/10.1038/s41597-023-02578-1" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1038/s41597-023-02578-1&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-sd-urbanitydata/2023-sd-urbanitydata.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Yap W, Stouffs R, Biljecki F (2023): Urbanity: automated modelling and analysis of multidimensional networks in cities. npj Urban Sustainability 3: 45. &lt;a href="https://doi.org/10.1038/s42949-023-00125-w" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1038/s42949-023-00125-w&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-npjus-urbanity/2023-npjus-urbanity.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
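&lt;p&gt;As a purely illustrative aside (this is not Urbanity&amp;rsquo;s actual API), the idea of a feature-rich urban network can be sketched with a small graph whose nodes and edges carry attributes, from which summary indicators are computed:&lt;/p&gt;

```python
# Illustrative sketch only (not Urbanity's API): a tiny street network as a
# feature-rich graph, with hypothetical node/edge attributes.
import networkx as nx

G = nx.Graph()
# Nodes are intersections with an illustrative attribute (greenery share).
G.add_node("A", green_view=0.32)
G.add_node("B", green_view=0.18)
G.add_node("C", green_view=0.45)
# Edges are street segments with lengths in metres (made-up values).
G.add_edge("A", "B", length=120.0)
G.add_edge("B", "C", length=80.0)
G.add_edge("A", "C", length=150.0)

# Simple network indicators of the kind such tools report.
mean_degree = sum(d for _, d in G.degree()) / G.number_of_nodes()
total_length = sum(l for _, _, l in G.edges(data="length"))
print(mean_degree, total_length)  # 2.0 350.0
```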
&lt;p&gt;These papers underscore our Lab&amp;rsquo;s commitment to open data, software, and reproducibility in research.
During his visit, Winston also visited the labs of Professor Myung-Jin Jun and Asst. Professor Yujin Park at the &lt;a href="http://planning.cau.ac.kr/" target="_blank" rel="noopener"&gt;Department of Urban Planning and Real Estate&lt;/a&gt;, &lt;a href="https://www.cau.ac.kr/" target="_blank" rel="noopener"&gt;Chung-Ang University&lt;/a&gt;.
Our group looks forward to continuing to collaborate with these wonderful research groups and to remaining part of FOSS4G.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/1_hu_99895aab42fc035a.webp 400w,
/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/1_hu_ed0fef5a6753260e.webp 760w,
/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/1_hu_2dd53ebd4a754762.webp 1200w"
src="https://ual.sg/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/1_hu_99895aab42fc035a.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/2_hu_e59b94a4ab2a6707.webp 400w,
/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/2_hu_1040c87f4622d6a8.webp 760w,
/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/2_hu_a0be00c3dc70aba9.webp 1200w"
src="https://ual.sg/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/2_hu_e59b94a4ab2a6707.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/3_hu_2523e4d18897389c.webp 400w,
/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/3_hu_8df77526b9a7011a.webp 760w,
/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/3_hu_99ce8ae2571e4afe.webp 1200w"
src="https://ual.sg/post/2023/12/22/foss4g-asia-and-research-visits-in-seoul/3_hu_2523e4d18897389c.webp"
width="596"
height="721"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Visit by Prof Song Gao from the University of Wisconsin - Madison</title><link>https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/</link><pubDate>Fri, 22 Dec 2023 10:39:19 +0800</pubDate><guid>https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/</guid><description>&lt;p&gt;Our Lab hosted Dr &lt;a href="https://geography.wisc.edu/staff/gao-song/" target="_blank" rel="noopener"&gt;Song Gao&lt;/a&gt;, Associate Professor at the &lt;a href="https://www.wisc.edu" target="_blank" rel="noopener"&gt;University of Wisconsin - Madison&lt;/a&gt;, where he leads the &lt;a href="https://geography.wisc.edu/geods/research" target="_blank" rel="noopener"&gt;Geospatial Data Science Lab&lt;/a&gt;. 🇺🇸&lt;/p&gt;
&lt;p&gt;Dr Song Gao is an Associate Professor with tenure in Geographic Information Science at the University of Wisconsin-Madison, where he leads the Geospatial Data Science Lab. His main research interests include GeoAI and Human Mobility. He is the (co-)author of over 100 peer-reviewed research articles, published in prominent journals such as PNAS, IJGIS, Landscape and Urban Planning, and Annals of AAG, with 7800+ Google Scholar citations. He is the PI of multiple research grants from the National Science Foundation and industry partners. He currently serves as the Associate Editor of IJGIS, the Chair of the AAG Specialty Group in GIS, the Communications Director of UCGIS, and the BOD Chair of CPGIS. He is the recipient of the Waldo Tobler Young Researcher Award in GIScience, the UCGIS Early/Mid-Career Research Award, and the AAG Spatial Analysis &amp;amp; Modeling Emerging Scholar Award, and is listed among the Web of Science Top 1% of global highly cited researchers.&lt;/p&gt;
&lt;p&gt;During his stay, besides several collaborative exchanges such as discussion sessions and meetings, Song delivered the lecture &lt;em&gt;Opportunities and Challenges in Geospatial AI&lt;/em&gt; (poster and abstract below).&lt;/p&gt;
&lt;p&gt;Thanks, and looking forward to future collaborations!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/1_hu_584f2029564ca66a.webp 400w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/1_hu_2c9a288968bac82.webp 760w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/1_hu_13b8d230904bf9e.webp 1200w"
src="https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/1_hu_584f2029564ca66a.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/2_hu_1fa383b405b5df92.webp 400w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/2_hu_9e3eb97040eda175.webp 760w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/2_hu_cc02b23f4b85a852.webp 1200w"
src="https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/2_hu_1fa383b405b5df92.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/3_hu_90c000966a29a4bd.webp 400w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/3_hu_469c23f69ab53d7e.webp 760w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/3_hu_f466d8dfa8d22f92.webp 1200w"
src="https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/3_hu_90c000966a29a4bd.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/4_hu_5d9daf9d1ca5f741.webp 400w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/4_hu_db27a7970d2abed1.webp 760w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/4_hu_620a71765a7c01d7.webp 1200w"
src="https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/4_hu_5d9daf9d1ca5f741.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/5_hu_841a7d854937341d.webp 400w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/5_hu_feb2759e9cd523f2.webp 760w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/5_hu_47bd629879daa317.webp 1200w"
src="https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/5_hu_841a7d854937341d.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/6_hu_76baac80d857c43e.webp 400w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/6_hu_6dda2ffe0eb3d93d.webp 760w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/6_hu_d817f4fdd609faf0.webp 1200w"
src="https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/6_hu_76baac80d857c43e.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/7_hu_44f700d61815c327.webp 400w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/7_hu_f42e285618553bc3.webp 760w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/7_hu_990f1c41cf4dda57.webp 1200w"
src="https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/7_hu_44f700d61815c327.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/8_hu_e3524ddcb92017ff.webp 400w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/8_hu_d49ccb1a575de51.webp 760w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/8_hu_7b6a149c3ba3d06f.webp 1200w"
src="https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/8_hu_e3524ddcb92017ff.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/9_hu_5a9bcdcd319cdeab.webp 400w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/9_hu_b0fbdeb5b17712ac.webp 760w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/9_hu_328ce871adcbd6d3.webp 1200w"
src="https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/9_hu_5a9bcdcd319cdeab.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/poster_hu_d0244d74f98bb46d.webp 400w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/poster_hu_6bf52e9955536e20.webp 760w,
/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/poster_hu_6e27f6fc4a48cbb0.webp 1200w"
src="https://ual.sg/post/2023/12/22/visit-by-prof-song-gao-from-the-university-of-wisconsin-madison/poster_hu_d0244d74f98bb46d.webp"
width="537"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract-of-the-lecture"&gt;Abstract of the lecture&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Geospatial artificial intelligence (GeoAI), an emerging interdisciplinary field, merges geographic knowledge with AI techniques to address significant scientific and engineering challenges in human-environmental systems. It focuses on enhancing machines’ spatial intelligence to improve dynamic perception, intelligent reasoning, knowledge discovery and mapping of geographic phenomena. This talk will introduce the historical roots of GeoAI, delve into the latest advancements in spatially explicit AI models, and examine innovative research and applications of GeoAI, as well as addressing some challenges associated with GeoAI foundation models.&lt;/p&gt;
&lt;/blockquote&gt;</description></item><item><title>The GeoAI handbook is out and we are proudly part of it</title><link>https://ual.sg/post/2023/12/17/the-geoai-handbook-is-out-and-we-are-proudly-part-of-it/</link><pubDate>Sun, 17 Dec 2023 06:19:22 +0800</pubDate><guid>https://ual.sg/post/2023/12/17/the-geoai-handbook-is-out-and-we-are-proudly-part-of-it/</guid><description>&lt;p&gt;The &lt;a href="https://www.taylorfrancis.com/books/edit/10.1201/9781003308423/handbook-geospatial-artificial-intelligence-song-gao-yingjie-hu-wenwen-li" target="_blank" rel="noopener"&gt;Handbook of Geospatial Artificial Intelligence&lt;/a&gt; has been published!&lt;/p&gt;
&lt;p&gt;The official description summarises it well:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;This comprehensive handbook covers Geospatial Artificial Intelligence (GeoAI), which is the integration of geospatial studies and AI machine (deep) learning and knowledge graph technologies. It explains key fundamental concepts, methods, models, and technologies of GeoAI, and discusses the recent advances, research tools, and applications that range from environmental observation and social sensing to natural disaster responses. As the first single volume on this fast-emerging domain, Handbook of Geospatial Artificial Intelligence is an excellent resource for educators, students, researchers, and practitioners utilizing GeoAI in fields such as information science, environment and natural resources, geosciences, and geography.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The book was organised and edited by &lt;a href="https://geography.wisc.edu/staff/gao-song/" target="_blank" rel="noopener"&gt;Song Gao&lt;/a&gt; (UW Madison), &lt;a href="https://www.buffalo.edu/cas/geography/faculty/faculty_directory/yingjie-hu.html" target="_blank" rel="noopener"&gt;Yingjie Hu&lt;/a&gt; (University at Buffalo), and &lt;a href="https://search.asu.edu/profile/1978357" target="_blank" rel="noopener"&gt;Wenwen Li&lt;/a&gt; (Arizona State University).&lt;/p&gt;
&lt;p&gt;The book packs a lot of interesting content and covers a variety of subjects pertaining to GeoAI &amp;ndash; it features 22 chapters, each describing a particular topic in the field.
Most of the leading research groups in the field globally are represented in this book, and our contribution is Chapter 17: GeoAI for Urban Sensing, written by &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The chapter provides a general introduction to the topic, discusses challenges and opportunities, and gives examples of research conducted at our Lab.
It is the only contribution from a research group from Asia.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F (2023): GeoAI for Urban Sensing. In Gao S, Hu Y, Li W (editors) &lt;em&gt;Handbook of Geospatial Artificial Intelligence&lt;/em&gt;. CRC Press, pp. 351-366. &lt;a href="https://doi.org/10.1201/9781003308423-17" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1201/9781003308423-17&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-geoai-handbook-urban-sensing/2023-geoai-handbook-urban-sensing.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Urban sensing has been an important topic in the past decades, and research has been amplified in the last several years with the emergence of new urban data sources and advancements in GeoAI. This chapter gives a high-level overview of the applications of GeoAI for urban sensing, which have multiplied across various domains. It reviews four examples of GeoAI applied for urban sensing, which span a variety of data sources, techniques developed, and application domains such as urban sustainability. Concluding this topic, several challenges and opportunities for future research are discussed, such as ethics and data quality.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-geoai-handbook-urban-sensing/"&gt;paper&lt;/a&gt; or have a look at the &lt;a href="https://www.taylorfrancis.com/books/edit/10.1201/9781003308423/handbook-geospatial-artificial-intelligence-song-gao-yingjie-hu-wenwen-li" target="_blank" rel="noopener"&gt;entire book&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-geoai-handbook-urban-sensing/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/17/the-geoai-handbook-is-out-and-we-are-proudly-part-of-it/page-one_hu_192223b7264f8819.webp 400w,
/post/2023/12/17/the-geoai-handbook-is-out-and-we-are-proudly-part-of-it/page-one_hu_b4374a5497358d35.webp 760w,
/post/2023/12/17/the-geoai-handbook-is-out-and-we-are-proudly-part-of-it/page-one_hu_23f9794ddc631be0.webp 1200w"
src="https://ual.sg/post/2023/12/17/the-geoai-handbook-is-out-and-we-are-proudly-part-of-it/page-one_hu_192223b7264f8819.webp"
width="509"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@inbook&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_geoai_handbook_urban_sensing&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;booktitle&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Handbook of Geospatial Artificial Intelligence}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;chapter&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{17}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1201/9781003308423-17}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;editor&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Song Gao and Yingjie Hu and Wenwen Li}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;isbn&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{9781003308423}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{351--366}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;publisher&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{CRC Press}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{GeoAI for Urban Sensing}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>AGU 2023 and research visits in the United States</title><link>https://ual.sg/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/</link><pubDate>Sat, 16 Dec 2023 11:21:29 +0800</pubDate><guid>https://ual.sg/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/</guid><description>&lt;p&gt;The PI of the Urban Analytics Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, participated in and contributed to the &lt;a href="https://www.agu.org/fall-meeting" target="_blank" rel="noopener"&gt;2023 AGU Annual Meeting&lt;/a&gt; in San Francisco, California, USA 🇺🇸.
This is the flagship event of &lt;a href="https://www.agu.org" target="_blank" rel="noopener"&gt;The American Geophysical Union (AGU)&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;He gave an invited presentation &lt;em&gt;Quality and Usability of Crowdsourced Global Building Information&lt;/em&gt; in the session &lt;em&gt;Detailed Guesstimates: The Art of Sampling, Simplifying, and Scaling a Feasible Distribution of Global Building Assets&lt;/em&gt;, which was organised by the &lt;a href="https://www.ornl.gov" target="_blank" rel="noopener"&gt;Oak Ridge National Laboratory&lt;/a&gt;.
In the presentation, Filip gave an overview of a couple of Lab papers focused on geospatial data on buildings, such as:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Chow YS, Lee K (2023): Quality of crowdsourced geospatial building information: A global assessment of OpenStreetMap attributes. &lt;em&gt;Building and Environment&lt;/em&gt; 237: 110295. &lt;a href="https://doi.org/10.1016/j.buildenv.2023.110295" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.buildenv.2023.110295&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-bae-osm-qa/2023-bae-osm-qa.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Chow YS (2022): Global Building Morphology Indicators. &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt; 95: 101809.
&lt;a href="https://doi.org/10.1016/j.compenvurbsys.2022.101809" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt;10.1016/j.compenvurbsys.2022.101809&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-ceus-gbmi/2022-ceus-gbmi.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/i&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;These papers encompass some of the pillars of our research agenda, such as spatial data quality assessment and enhancing the usability of crowdsourced geospatial data in the built environment.&lt;/p&gt;
&lt;p&gt;As part of this trip, Filip visited the following departments, sister labs, and key collaborators:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Georgia Institute of Technology, &lt;a href="https://planning.gatech.edu" target="_blank" rel="noopener"&gt;School of City and Regional Planning&lt;/a&gt;, &lt;a href="https://friendlycities.gatech.edu" target="_blank" rel="noopener"&gt;Friendly Cities Lab&lt;/a&gt; (&lt;a href="https://planning.gatech.edu/people/clio-andris" target="_blank" rel="noopener"&gt;Prof Clio Andris&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Emory University, &lt;a href="https://envs.emory.edu" target="_blank" rel="noopener"&gt;Department of Environmental Sciences&lt;/a&gt; (&lt;a href="https://envs.emory.edu/people/bios/Huang-Xiao%20.html" target="_blank" rel="noopener"&gt;Prof Xiao Huang&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;University of South Carolina, &lt;a href="http://gis.cas.sc.edu/cegis/" target="_blank" rel="noopener"&gt;Center for GIScience and Geospatial Big Data&lt;/a&gt; (&lt;a href="http://www.kkyyhh96.site/" target="_blank" rel="noopener"&gt;Prof Yuhao Kang&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;University of Southern California, &lt;a href="https://dornsife.usc.edu/spatial/" target="_blank" rel="noopener"&gt;Spatial Sciences Institute&lt;/a&gt; (&lt;a href="https://dornsife.usc.edu/spatial/profile/siqin-sisi-wang/" target="_blank" rel="noopener"&gt;Prof Sisi Wang&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;During the visit, he took part in the PhD committee of &lt;a href="https://ual.sg/author/xiaofan-liang/"&gt;Xiaofan Liang&lt;/a&gt;, who defended her doctoral thesis &lt;em&gt;Connectivity for whom and at what cost: contesting network infrastructure duality in urban planning&lt;/em&gt;, and conducted part of her research in our NUS Urban Analytics Lab.
Big congrats, Dr Liang! She is moving to the University of Michigan, Ann Arbor, as faculty.
More about her work can be found at &lt;a href="https://www.xiaofanliang.com" target="_blank" rel="noopener"&gt;her personal webpage&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Many thanks to our collaborators and hosts Clio, Yuhao, Sisi, and Xiao, and to everyone else for the great hospitality.
We look forward to continuing to collaborate with these wonderful research groups.&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-guest-lecture-by-dr-filip-biljecki-at-georgia-tech"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Guest lecture by Dr Filip Biljecki at Georgia Tech" srcset="
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-1_hu_ec1f9ad215407aac.webp 400w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-1_hu_ac80baf0c1c562e3.webp 760w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-1_hu_e236ff40d581fa3d.webp 1200w"
src="https://ual.sg/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-1_hu_ec1f9ad215407aac.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Guest lecture by Dr Filip Biljecki at Georgia Tech
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-phd-defence-by-xiaofan-liang-at-georgia-tech"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="PhD defence by Xiaofan Liang at Georgia Tech" srcset="
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-2_hu_c52ff315db8c31f.webp 400w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-2_hu_82eed640d9be36d5.webp 760w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-2_hu_d015c35210849730.webp 1200w"
src="https://ual.sg/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-2_hu_c52ff315db8c31f.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
PhD defence by Xiaofan Liang at Georgia Tech
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-phd-defence-by-xiaofan-liang-at-georgia-tech"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="PhD defence by Xiaofan Liang at Georgia Tech" srcset="
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-3_hu_8f0d7c6cf847df4b.webp 400w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-3_hu_c99f362445daebaa.webp 760w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-3_hu_cc8ce66b16ff8d53.webp 1200w"
src="https://ual.sg/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-3_hu_8f0d7c6cf847df4b.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
PhD defence by Xiaofan Liang at Georgia Tech
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-georgia-tech"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Georgia Tech" srcset="
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-4_hu_6e452afb1c33b356.webp 400w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-4_hu_4708f485e9303249.webp 760w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-4_hu_13a965fb833dd535.webp 1200w"
src="https://ual.sg/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/gt-4_hu_6e452afb1c33b356.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Georgia Tech
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-emory-university"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Emory University" srcset="
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/emory-1_hu_63a0c5cf91239982.webp 400w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/emory-1_hu_6883747ba6d63e87.webp 760w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/emory-1_hu_b58ed1400b0cc2f2.webp 1200w"
src="https://ual.sg/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/emory-1_hu_63a0c5cf91239982.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Emory University
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-university-of-south-carolina"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="University of South Carolina" srcset="
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-sc-1_hu_bfcee213d8f149ac.webp 400w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-sc-1_hu_b41fba22012879de.webp 760w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-sc-1_hu_c5ad532d28217cee.webp 1200w"
src="https://ual.sg/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-sc-1_hu_bfcee213d8f149ac.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
University of South Carolina
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-university-of-south-carolina"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="University of South Carolina" srcset="
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-sc-2_hu_3e7a7f36f3005554.webp 400w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-sc-2_hu_4c0bf387ffb04d1f.webp 760w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-sc-2_hu_3402e77c93aecfc.webp 1200w"
src="https://ual.sg/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-sc-2_hu_3e7a7f36f3005554.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
University of South Carolina
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-university-of-southern-california"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="University of Southern California" srcset="
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-ca-1_hu_4499d521953480cc.webp 400w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-ca-1_hu_6232491aca96517c.webp 760w,
/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-ca-1_hu_bae6a14d4bbc8bf6.webp 1200w"
src="https://ual.sg/post/2023/12/16/agu-2023-and-research-visits-in-the-united-states/usc-ca-1_hu_4499d521953480cc.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
University of Southern California
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper: Assessing the Equity and Evolution of Urban Visual Perceptual Quality with Time Series Street View Imagery</title><link>https://ual.sg/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/</link><pubDate>Sun, 10 Dec 2023 18:23:22 +0800</pubDate><guid>https://ual.sg/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Wang Z, Ito K, Biljecki F (2024): Assessing the equity and evolution of urban visual perceptual quality with time series street view imagery. &lt;em&gt;Cities&lt;/em&gt; 145: 104704. &lt;a href="https://doi.org/10.1016/j.cities.2023.104704" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.cities.2023.104704&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-cities-evolution/2024-cities-evolution.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/zeyu-wang/"&gt;Zeyu Wang&lt;/a&gt;, our Master of Urban Planning graduate.
Congratulations on the first journal publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/1_hu_f58ad573c03656ea.webp 400w,
/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/1_hu_60165f40e17a2f80.webp 760w,
/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/1_hu_31070ed47cca3d43.webp 1200w"
src="https://ual.sg/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/1_hu_f58ad573c03656ea.webp"
width="760"
height="368"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/2_hu_b5e343f14fdc65bc.webp 400w,
/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/2_hu_6c9e9ebb47512b82.webp 760w,
/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/2_hu_ab2f2ab43450e8ca.webp 1200w"
src="https://ual.sg/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/2_hu_b5e343f14fdc65bc.webp"
width="760"
height="349"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1iD8Yy5jOr6Ri" target="_blank" rel="noopener"&gt;freely available&lt;/a&gt; until 2024-01-26.&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The well-being of residents is considerably influenced by the quality of their environment. However, due to the lack of large-scale quantitative and longitudinal evaluation methods, it has been challenging to assess residents&amp;rsquo; satisfaction and achieve social inclusion goals in neighborhoods. We develop a novel cost-effective method that utilizes time series street view imagery for evaluating and monitoring visual environmental quality in neighborhoods. Unlike most research that relies on site visits or surveys, this study trains a deep learning model with a large-scale dataset to analyze six perception indicators&amp;rsquo; scores in neighborhoods in different geographies and does so longitudinally thanks to imagery taken over a period of a decade, a novelty in the body of knowledge. Implementing the approach, we examine public housing neighborhoods in Singapore and New York City as case studies. The results demonstrated that temporal imagery can effectively assess spatial equity and monitor the visual environmental qualities of neighborhoods over time, providing a new, comprehensive, and scalable workflow. It can help governments improve policies and make informed decisions on enhancing the design and living standards of urban residential areas, including public housing communities, which may be affected by social stigmatization, and monitor the effectiveness of their policies and actions.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-cities-evolution/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-cities-evolution/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/page-one_hu_a6ddedcbbb0112e8.webp 400w,
/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/page-one_hu_5455a7c74c7eea7f.webp 760w,
/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/page-one_hu_b19c58aac7d3c2ae.webp 1200w"
src="https://ual.sg/post/2023/12/10/new-paper-assessing-the-equity-and-evolution-of-urban-visual-perceptual-quality-with-time-series-street-view-imagery/page-one_hu_a6ddedcbbb0112e8.webp"
width="578"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_cities_evolution&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Wang, Zeyu and Ito, Koichi and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.cities.2023.104704}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104704}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Assessing the equity and evolution of urban visual perceptual quality with time series street view imagery}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{145}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper and dataset: District-scale surface temperatures generated from high-resolution longitudinal thermal infrared images</title><link>https://ual.sg/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/</link><pubDate>Mon, 04 Dec 2023 08:23:16 +0800</pubDate><guid>https://ual.sg/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Lin S, Ramani V, Martin M, Arjunan P, Chong A, Biljecki F, Ignatius M, Poolla K, Miller C (2023): District-scale surface temperatures generated from high-resolution longitudinal thermal infrared images. &lt;em&gt;Scientific Data&lt;/em&gt; 10: 859. &lt;a href="https://doi.org/10.1038/s41597-023-02749-0" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1038/s41597-023-02749-0&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-sd-iris/2023-sd-iris.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This paper presents an openly released dataset collected from thermal observatories deployed on the campus of the National University of Singapore.&lt;/p&gt;
&lt;p&gt;Infrared thermography provides a non-contact technique to analyze various aspects of the built environment. While most studies focus on the city and building scales, rooftop observatories provide high-resolution observations of dynamic interactions at the neighborhood scale. The first rooftop thermal observatory with a multi-modal platform capable of assessing a wide range of dynamical processes in urban systems was deployed in Singapore. It was placed on top of a building that overlooks several educational buildings on the campus of the National University of Singapore. The platform collects longitudinal remote sensing data from a tropical area, allowing users to determine the temperature trends of individual features such as buildings, roads, and vegetation. To help users manage and analyze the raw data and utilize it as they see fit, demonstration code with preprocessing steps such as segmentation is provided.&lt;/p&gt;
&lt;p&gt;The project was spearheaded by &lt;a href="https://sg.linkedin.com/in/subin-lin-81710b211" target="_blank" rel="noopener"&gt;Subin Lin&lt;/a&gt; from NUS and the Berkeley Education Alliance for Research in Singapore (BEARS).&lt;/p&gt;
&lt;p&gt;The dataset can be accessed &lt;a href="https://github.com/buds-lab/project-iris-dataset" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/1_hu_98b87a012bff2c73.webp 400w,
/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/1_hu_df67b596bc2f4cd6.webp 760w,
/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/1_hu_248f6c231ee0033.webp 1200w"
src="https://ual.sg/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/1_hu_98b87a012bff2c73.webp"
width="760"
height="530"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/2_hu_d78bbbec2ee322a0.webp 400w,
/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/2_hu_636ebe8c3b8d8869.webp 760w,
/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/2_hu_73a0a6e8a531780d.webp 1200w"
src="https://ual.sg/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/2_hu_d78bbbec2ee322a0.webp"
width="760"
height="412"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;This paper describes a dataset collected by infrared thermography, a non-contact, non-intrusive technique to acquire data and analyze the built environment in various aspects. While most studies focus on the city and building scales, an observatory installed on a rooftop provides high temporal and spatial resolution observations with dynamic interactions on the district scale. The rooftop infrared thermography observatory with a multi-modal platform capable of assessing a wide range of dynamic processes in urban systems was deployed in Singapore. It was placed on the top of two buildings that overlook the outdoor context of the National University of Singapore campus. The platform collects remote sensing data from tropical areas on a temporal scale, allowing users to determine the temperature trend of individual features such as buildings, roads, and vegetation. The dataset includes 1,365,921 thermal images collected on average at approximately 10-second intervals from two locations during ten months.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-sd-iris/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-sd-iris/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/page-one_hu_ed626038995c51b1.webp 400w,
/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/page-one_hu_45a71758866e8131.webp 760w,
/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/page-one_hu_7ffbec9ef364edc4.webp 1200w"
src="https://ual.sg/post/2023/12/04/new-paper-and-dataset-district-scale-surface-temperatures-generated-from-high-resolution-longitudinal-thermal-infrared-images/page-one_hu_ed626038995c51b1.webp"
width="576"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_sd_iris&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Lin, Subin and Ramani, Vasantha and Martin, Miguel and Arjunan, Pandarasamy and Chong, Adrian and Biljecki, Filip and Ignatius, Marcel and Poolla, Kameshwar and Miller, Clayton}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1038/s41597-023-02749-0}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Scientific Data}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{859}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{District-scale surface temperatures generated from high-resolution longitudinal thermal infrared images}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>The Building Data Genome Directory – An open, comprehensive data sharing platform for building performance research</title><link>https://ual.sg/post/2023/12/03/the-building-data-genome-directory-an-open-comprehensive-data-sharing-platform-for-building-performance-research/</link><pubDate>Sun, 03 Dec 2023 11:29:46 +0800</pubDate><guid>https://ual.sg/post/2023/12/03/the-building-data-genome-directory-an-open-comprehensive-data-sharing-platform-for-building-performance-research/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Jin X, Fu C, Kazmi H, Balint A, Canaydin A, Quintana M, Biljecki F, Xiao F, Miller C (2023): The Building Data Genome Directory – An open, comprehensive data sharing platform for building performance research. &lt;em&gt;Journal of Physics: Conference Series&lt;/em&gt; 2600: 032003. &lt;a href="https://doi.org/10.1088/1742-6596/2600/3/032003" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1088/1742-6596/2600/3/032003&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-cisbat-directory/2023-cisbat-directory.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This paper presents the Building Data Genome Directory, an open data-sharing platform serving as a one-stop shop for the data necessary for vital categories of building energy research.&lt;/p&gt;
&lt;p&gt;The project was spearheaded by &lt;a href="https://www.researchgate.net/profile/Xiaoyu-Jin-8" target="_blank" rel="noopener"&gt;Xiaoyu Jin&lt;/a&gt; from The Hong Kong Polytechnic University.&lt;/p&gt;
&lt;p&gt;The directory can be accessed &lt;a href="http://buildingdatadirectory.org/" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;The building sector plays a crucial role in the worldwide decarbonization effort, accounting for significant portions of energy consumption and environmental effects. However, the scarcity of open data sources is a continuous challenge for built environment researchers and practitioners. Although several efforts have been made to consolidate existing open datasets, no database currently offers a comprehensive collection of building data types with all subcategories and time granularities (e.g., year, month, and sub-hour). This paper presents the Building Data Genome Directory, an open data-sharing platform serving as a one-stop shop for the data necessary for vital categories of building energy research. The data directory is an online portal (buildingdatadirectory.org/) that allows filtering and discovering valuable datasets. The directory covers meter, building-level, and aggregated community-level data at the spatial scale and year-to-minute level at the temporal scale. The datasets were consolidated from a comprehensive exploration of sources, including governments, research institutes, and online energy dashboards. The results of this effort include the aggregation of 60 datasets pertaining to building energy ontologies, building energy models, building energy and water data, electric vehicle data, weather data, building information data, text-mining-based research data, image data of buildings, fault detection diagnosis data and occupant data. A crowdsourcing mechanism in the platform allows users to submit datasets they suggest for inclusion by filling out an online form. This directory can fuel research and applications on building energy efficiency, which is an essential step toward addressing the world&amp;rsquo;s energy and environmental challenges.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-cisbat-directory/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_cisbat_directory&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Jin, Xiaoyu and Fu, Chun and Kazmi, Hussain and Balint, Atilla and Canaydin, Ada and Quintana, Matias and Biljecki, Filip and Xiao, Fu and Miller, Clayton}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1088/1742-6596/2600/3/032003}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Journal of Physics: Conference Series}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{3}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{032003}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{The Building Data Genome Directory -- An open, comprehensive data sharing platform for building performance research}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2600}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Keynote and papers at ACM BuildSys 2023, and introducing a Kaggle competition</title><link>https://ual.sg/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/</link><pubDate>Sun, 19 Nov 2023 21:29:43 +0800</pubDate><guid>https://ual.sg/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/</guid><description>&lt;p&gt;Our research group and our sibling labs &amp;ndash; &lt;a href="https://budslab.org" target="_blank" rel="noopener"&gt;BUDS Lab&lt;/a&gt; and &lt;a href="https://ideaslab.io" target="_blank" rel="noopener"&gt;IDEAS Lab&lt;/a&gt; &amp;ndash; have been active at this year&amp;rsquo;s &lt;a href="https://buildsys.acm.org/2023/" target="_blank" rel="noopener"&gt;ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation (BuildSys)&lt;/a&gt;, which took place in Istanbul, Turkey. 🇹🇷&lt;/p&gt;
&lt;p&gt;BuildSys is a highly selective, single-track forum for research on systems issues covering all aspects of the built environment.&lt;/p&gt;
&lt;p&gt;The PI of the Lab &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; gave a keynote at the &lt;a href="https://www.flanigansaluslab.com/cpsis-2023" target="_blank" rel="noopener"&gt;1st Int&amp;rsquo;l Workshop on Cyber-Physical-Social Infrastructure Systems (CPSIS'23)&lt;/a&gt;, which was organised by &lt;a href="https://www.flanigansaluslab.com" target="_blank" rel="noopener"&gt;Prof Katherine Flanigan (Carnegie Mellon University)&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Two papers were published and presented at the conference:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Miller C, Quintana M, Frei M, Chua YX, Fu C, Picchetti B, Yap W, Chong A, Biljecki F (2023): Introducing the Cool, Quiet City Competition: Predicting Smartwatch-Reported Heat and Noise with Digital Twin Metrics. Proceedings of the 10th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation, pp. 298-299. &lt;a href="https://doi.org/10.1145/3600100.3626269" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1145/3600100.3626269&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-buildsys-cool-quiet/2023-buildsys-cool-quiet.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Ramani V, Ignatius M, Lim J, Biljecki F, Miller C (2023): A Dynamic Urban Digital Twin Integrating Longitudinal Thermal Imagery for Microclimate Studies. Proceedings of the 10th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation, pp. 421-428. &lt;a href="https://doi.org/10.1145/3600100.3626345" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1145/3600100.3626345&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-buildsys-ir-dt/2023-buildsys-ir-dt.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The first paper introduces the &lt;a href="https://www.kaggle.com/competitions/cool-quiet-city-competition" target="_blank" rel="noopener"&gt;&lt;em&gt;Cool, Quiet City Competition &amp;ndash; Predicting Smartwatch-Reported Heat and Noise with Digital Twin Metrics&lt;/em&gt;&lt;/a&gt;, a machine learning competition hosted on Kaggle in which participants train models on various contextual data to predict noise distraction and its sources, as well as thermal preference, across a diversity of spaces.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/1_hu_8c4dfee8a8fafba1.webp 400w,
/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/1_hu_5e29941f2cc7b4dc.webp 760w,
/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/1_hu_525f5b56efc4d2eb.webp 1200w"
src="https://ual.sg/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/1_hu_8c4dfee8a8fafba1.webp"
width="760"
height="350"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The second paper presents ongoing efforts to build a digital twin that integrates longitudinal thermal envelope data of buildings on our campus with a virtual 3D model.
Thermal images of the buildings were captured over a few months using a neighborhood-scale infrared observatory.
Integrating these data sources in digital twins is novel, and we hope to make further advancements in this domain.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/2_hu_d75754305e4f06e7.webp 400w,
/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/2_hu_7898bda8f59c1bdf.webp 760w,
/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/2_hu_79a18aae1c261741.webp 1200w"
src="https://ual.sg/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/2_hu_d75754305e4f06e7.webp"
width="760"
height="403"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Lots of people were involved in these efforts and in the conference, e.g. &lt;a href="https://ual.sg/author/matias-quintana/"&gt;Matias Quintana&lt;/a&gt; was the poster chair of the conference.&lt;/p&gt;
&lt;p&gt;See you next year!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/3_hu_a4c4253ab7d5d1f3.webp 400w,
/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/3_hu_bf9ce409a2a208d.webp 760w,
/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/3_hu_cd45d142f47a29c0.webp 1200w"
src="https://ual.sg/post/2023/11/19/keynote-and-papers-at-acm-buildsys-2023-and-introducing-a-kaggle-competition/3_hu_a4c4253ab7d5d1f3.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;BibTeX citations:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_buildsys_cool_quiet&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Introducing the Cool, Quiet City Competition: Predicting Smartwatch-Reported Heat and Noise with Digital Twin Metrics}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Miller, Clayton and Quintana, Matias and Frei, Mario and Chua, Yun Xuan and Fu, Chun and Picchetti, Bianca and Yap, Winston and Chong, Adrian and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Proceedings of the 10th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1145/3600100.3626269}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{298--299}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_buildsys_ir_dt&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{A Dynamic Urban Digital Twin Integrating Longitudinal Thermal Imagery for Microclimate Studies}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Ramani, Vasantha and Ignatius, Marcel and Lim, Joie and Biljecki, Filip and Miller, Clayton}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Proceedings of the 10th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1145/3600100.3626345}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{421--428}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Invited conference talk and guest lecture in Korea</title><link>https://ual.sg/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/</link><pubDate>Sat, 18 Nov 2023 23:10:43 +0800</pubDate><guid>https://ual.sg/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/</guid><description>&lt;p&gt;The research of the NUS Urban Analytics Lab was presented in Korea by the PI of the Lab &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;. 🇰🇷&lt;/p&gt;
&lt;p&gt;First, he gave an invited talk at the &lt;a href="https://icgis2023.com" target="_blank" rel="noopener"&gt;2023 International Conference on Geospatial Information Science (ICGIS)&lt;/a&gt;, an event organised by the &lt;a href="https://www.krihs.re.kr/eng/" target="_blank" rel="noopener"&gt;Korea Research Institute for Human Settlements (KRIHS)&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/1_hu_239d53ef0b890b60.webp 400w,
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/1_hu_766e802f8dc13d67.webp 760w,
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/1_hu_25fce2e98643e266.webp 1200w"
src="https://ual.sg/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/1_hu_239d53ef0b890b60.webp"
width="760"
height="352"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;ICGIS is an international academic conference held annually since 1996.
This year&amp;rsquo;s theme was &lt;em&gt;Geosimulation for Better Life&lt;/em&gt;, and it was part of the &lt;a href="http://smartgeoexpo.kr" target="_blank" rel="noopener"&gt;Smart Geo Expo 2023&lt;/a&gt;, an annual flagship event of the Korean geospatial community.&lt;/p&gt;
&lt;p&gt;The keynote speaker of the event was &lt;a href="https://pages.charlotte.edu/jean-claude-thill/" target="_blank" rel="noopener"&gt;Prof Jean-Claude Thill&lt;/a&gt; from the University of North Carolina at Charlotte.
The community owes him gratitude for his service as the former editor-in-chief of the journal &lt;a href="https://www.sciencedirect.com/journal/computers-environment-and-urban-systems" target="_blank" rel="noopener"&gt;Computers, Environment and Urban Systems&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The full programme is included below.&lt;/p&gt;
&lt;p&gt;We appreciate the organisation of the event and invitation.
Many thanks go especially to Dr Jae Soen Son (손재선) from the KRIHS&amp;rsquo;s Geospatial Analytics &amp;amp; Monitoring Center.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/2_hu_5ac171cf583b7604.webp 400w,
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/2_hu_85bb478ca7198053.webp 760w,
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/2_hu_bd23acae197b56b4.webp 1200w"
src="https://ual.sg/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/2_hu_5ac171cf583b7604.webp"
width="536"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Second, Filip gave a guest lecture at the &lt;a href="http://hyurban.hanyang.ac.kr" target="_blank" rel="noopener"&gt;Department of Urban Planning and Engineering at Hanyang University&lt;/a&gt;, thanks to the organisation of &lt;a href="http://junhwan89.cafe24.com/mainpeople/nueva-professor" target="_blank" rel="noopener"&gt;Prof Sugie Lee&lt;/a&gt; (이수기) and &lt;a href="https://scholar.google.com/citations?user=eaH__jsAAAAJ&amp;amp;hl=zh-CN" target="_blank" rel="noopener"&gt;Dr Li Na&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Hanyang University is a private research university in Seoul, and one of the most prestigious universities in the country.
Notably, it has one of the leading departments in urban planning in Korea.&lt;/p&gt;
&lt;p&gt;Prof Lee heads the department and leads the &lt;a href="http://udsal.hanyang.ac.kr" target="_blank" rel="noopener"&gt;Urban Design &amp;amp; Spatial Analysis Lab (UDSAL)&lt;/a&gt;, a prominent research group with which we hope to collaborate further.&lt;/p&gt;
&lt;p&gt;Many thanks for the hospitality!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/3_hu_d148aa0e057f4caa.webp 400w,
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/3_hu_38af819fae6c60a8.webp 760w,
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/3_hu_19fe26c11684dd55.webp 1200w"
src="https://ual.sg/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/3_hu_d148aa0e057f4caa.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/4_hu_97720fef4f1e978b.webp 400w,
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/4_hu_42c8e2c388b8f27c.webp 760w,
/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/4_hu_86c27fc56e2166a.webp 1200w"
src="https://ual.sg/post/2023/11/18/invited-conference-talk-and-guest-lecture-in-korea/4_hu_97720fef4f1e978b.webp"
width="760"
height="571"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Keynote at the International Land Use Symposium 2023</title><link>https://ual.sg/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/</link><pubDate>Sun, 22 Oct 2023 12:10:43 +0800</pubDate><guid>https://ual.sg/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/</guid><description>&lt;p&gt;The &lt;a href="https://ilus2023.ioer.info/" target="_blank" rel="noopener"&gt;4th International Land Use Symposium (ILUS)&lt;/a&gt; on &amp;ldquo;Urban Analytics for Transforming Cities and Regions: Tools, Methods and Application&amp;rdquo; was held between 4 and 6 October 2023, in Ahmedabad, India. 🇮🇳&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/0_hu_a4a98483f976cd14.webp 400w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/0_hu_d173504da520f33.webp 760w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/0_hu_a16d7b3f1010d53d.webp 1200w"
src="https://ual.sg/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/0_hu_a4a98483f976cd14.webp"
width="537"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The event was hosted by &lt;a href="https://cept.ac.in/" target="_blank" rel="noopener"&gt;CEPT University&lt;/a&gt; and co-organised by the &lt;a href="https://www.ioer.de/en" target="_blank" rel="noopener"&gt;Leibniz Institute of Ecological Urban and Regional Development (IOER)&lt;/a&gt; 🇩🇪.&lt;/p&gt;
&lt;p&gt;The PI of the Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, represented the Lab with a keynote talk.&lt;/p&gt;
&lt;p&gt;We appreciate the organisation of the event and the invitation.
In particular, many thanks to Shaily Gandhi, Mathias Jehling, and Martin Behnisch.
Appreciation goes also to the &lt;a href="https://crdf.org.in/" target="_blank" rel="noopener"&gt;CEPT Research and Development Foundation (CRDF)&lt;/a&gt;, &lt;a href="https://crdf.org.in/center/center-for-applied-geomatics" target="_blank" rel="noopener"&gt;Center for Applied Geomatics (CAG)&lt;/a&gt;, and &lt;a href="https://www.sac.gov.in/Vyom/overview" target="_blank" rel="noopener"&gt;Space Applications Centre (SAC), ISRO, Ahmedabad&lt;/a&gt; for their role in the organisation.&lt;/p&gt;
&lt;p&gt;We very much appreciated the hospitality, and we look forward to collaborating.&lt;/p&gt;
&lt;p&gt;Credit for most of the photos goes to CEPT University.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/1_hu_9c8142142e099b3.webp 400w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/1_hu_ec3096206d090ff9.webp 760w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/1_hu_618f31e7857af76b.webp 1200w"
src="https://ual.sg/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/1_hu_9c8142142e099b3.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/2_hu_f41cc34b73673a24.webp 400w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/2_hu_19051fe09fc2185f.webp 760w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/2_hu_1068916115708ca4.webp 1200w"
src="https://ual.sg/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/2_hu_f41cc34b73673a24.webp"
width="507"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/3_hu_1bcfde87174b324e.webp 400w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/3_hu_ff4589b56f2e490e.webp 760w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/3_hu_4ed82a1cfd0e9099.webp 1200w"
src="https://ual.sg/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/3_hu_1bcfde87174b324e.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/4_hu_cfbd85e6aae810c6.webp 400w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/4_hu_83eb95360d7b1c57.webp 760w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/4_hu_9c5130e52455992b.webp 1200w"
src="https://ual.sg/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/4_hu_cfbd85e6aae810c6.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/5_hu_1f1953fc2c88a542.webp 400w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/5_hu_b7b5a0c6706983f4.webp 760w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/5_hu_21b2aa5074e040e1.webp 1200w"
src="https://ual.sg/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/5_hu_1f1953fc2c88a542.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/6_hu_93e0ded4be4ba18.webp 400w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/6_hu_c8f2e469e590ee56.webp 760w,
/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/6_hu_900dc30ba3aeacec.webp 1200w"
src="https://ual.sg/post/2023/10/22/keynote-at-the-international-land-use-symposium-2023/6_hu_93e0ded4be4ba18.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Our activities at the Future Cities Lab Global conference at ETH Zurich</title><link>https://ual.sg/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/</link><pubDate>Wed, 18 Oct 2023 19:28:49 +0800</pubDate><guid>https://ual.sg/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/</guid><description>&lt;p&gt;Our research group is managing the project &lt;a href="https://fcl.ethz.ch/research/integration-and-strategies/semantic-urban-elements.html" target="_blank" rel="noopener"&gt;Semantic Urban Elements&lt;/a&gt; at the &lt;a href="https://sec.ethz.ch" target="_blank" rel="noopener"&gt;Singapore-ETH Centre&lt;/a&gt; &amp;ndash; &lt;a href="https://fcl.ethz.ch" target="_blank" rel="noopener"&gt;Future Cities Lab Global&lt;/a&gt;, in collaboration with the &lt;a href="https://coss.ethz.ch" target="_blank" rel="noopener"&gt;ETH Zurich Computational Social Science group&lt;/a&gt; led by Professor &lt;a href="https://coss.ethz.ch/people/helbing.html" target="_blank" rel="noopener"&gt;Dirk Helbing&lt;/a&gt;.
We are quite active in cultivating this collaboration, e.g. &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, who is the Principal Investigator of the project, &lt;a href="https://ual.sg/post/2023/06/30/visits-to-switzerland-and-austria/"&gt;spent some time at ETH Zurich earlier this year&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;As part of the project and collaboration, we just attended the &lt;a href="https://fclg-ep.ethz.ch" target="_blank" rel="noopener"&gt;Conference and Exhibition of the Future Cities Laboratory Global&lt;/a&gt; hosted at ETH Zurich in Switzerland, to which we contributed in several ways.&lt;/p&gt;
&lt;p&gt;Almost the entire project team, based in Singapore and Zurich, attended it and contributed with presentations and discussions: &lt;a href="https://ual.sg/author/chenyi-cai/"&gt;Chenyi Cai&lt;/a&gt;, &lt;a href="https://ual.sg/author/matias-quintana/"&gt;Matias Quintana&lt;/a&gt;, Rohit Dubey, Javier Argota Sánchez-Vaquerizo, Pieter Herthogs, Aurel von Richthofen, Stefan Müller Arisona, Christoph Hoelscher, Dirk Helbing, and &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/0_hu_6c0de097db1df54e.webp 400w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/0_hu_b333bddc532e6242.webp 760w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/0_hu_3e1807c492067bd8.webp 1200w"
src="https://ual.sg/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/0_hu_6c0de097db1df54e.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The two-day conference featured prominent keynote speakers and panel discussions, and offered an opportunity to interact directly with young researchers.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/0a_hu_fc926ef5c5f32a7f.webp 400w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/0a_hu_e5bdc8a0f75545b1.webp 760w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/0a_hu_833e08225cc78799.webp 1200w"
src="https://ual.sg/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/0a_hu_fc926ef5c5f32a7f.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;A public exhibition showcased the results of the ongoing research activities.
Bringing together science, design, engineering and governance, FCL Global actively promotes the development of more sustainable, resilient and inclusive cities, setting the stage for a more liveable urban future.&lt;/p&gt;
&lt;p&gt;The event was attended by prominent scientists in our domain such as Professor Michael Batty.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/1_hu_3d080365d1918c8b.webp 400w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/1_hu_4ac4a57da94000ba.webp 760w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/1_hu_76ded94d0a750a5d.webp 1200w"
src="https://ual.sg/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/1_hu_3d080365d1918c8b.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/2_hu_af2730a569ced816.webp 400w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/2_hu_5690fe1cb1d0b2bb.webp 760w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/2_hu_4dbbb4f9a9403b40.webp 1200w"
src="https://ual.sg/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/2_hu_af2730a569ced816.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/3_hu_fa537a85adbc4e0a.webp 400w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/3_hu_d53fd5217cd77cea.webp 760w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/3_hu_7ce38723db303f5e.webp 1200w"
src="https://ual.sg/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/3_hu_fa537a85adbc4e0a.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/4_hu_3acc77f44874750c.webp 400w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/4_hu_3b4fed71153a4114.webp 760w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/4_hu_c275108e1df75d6d.webp 1200w"
src="https://ual.sg/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/4_hu_3acc77f44874750c.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/5_hu_4cc86b662a3b7cdf.webp 400w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/5_hu_8d07c6a7cf0048d2.webp 760w,
/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/5_hu_6fc7bb7d6ad355e7.webp 1200w"
src="https://ual.sg/post/2023/10/18/our-activities-at-the-future-cities-lab-global-conference-at-eth-zurich/5_hu_4cc86b662a3b7cdf.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Urban Informatics Paper of the Year Award (2023)</title><link>https://ual.sg/post/2023/10/10/urban-informatics-paper-of-the-year-award-2023/</link><pubDate>Tue, 10 Oct 2023 13:40:19 +0800</pubDate><guid>https://ual.sg/post/2023/10/10/urban-informatics-paper-of-the-year-award-2023/</guid><description>&lt;p&gt;We are immensely proud that &lt;a href="https://ual.sg/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/"&gt;the paper&lt;/a&gt; led by our master graduate &lt;a href="https://ual.sg/author/xinyu-chen/"&gt;Xinyu Chen&lt;/a&gt; has been &lt;a href="https://www.isocui.org/#/awards/icui2023" target="_blank" rel="noopener"&gt;awarded&lt;/a&gt; by the journal &lt;a href="https://link.springer.com/journal/44212" target="_blank" rel="noopener"&gt;Urban Informatics&lt;/a&gt; as the best paper of the year! 🏆&lt;/p&gt;
&lt;p&gt;The paper awarded with the &lt;em&gt;Urban Informatics Paper of the Year Award (2023)&lt;/em&gt; is:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Chen X, Biljecki F (2022): Mining real estate ads and property transactions for building and amenity data acquisition. &lt;em&gt;Urban Informatics&lt;/em&gt; 1: 12. &lt;a href="https://doi.org/10.1007/s44212-022-00012-2" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1007/s44212-022-00012-2&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-ui-real-estate-mining/2022-ui-real-estate-mining.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Xinyu graduated with a master's degree from NUS, and this paper is based on her graduation work, which was supervised by &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;.
This is her first journal paper, marking a great start to her academic career.&lt;/p&gt;
&lt;p&gt;This paper is the first to uncover the potential of real estate data for GIS purposes, and it puts forward the idea of such data as an instance of user-generated / volunteered geographic information that has so far been overlooked in the field.
More information about it can be found below.&lt;/p&gt;
&lt;p&gt;This recognition is also a testament to our dedication to mentoring our master's students and integrating them into our research group.
Our Lab continues to support master's students in their pursuit of excellence.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://link.springer.com/journal/44212" target="_blank" rel="noopener"&gt;Urban Informatics (UI)&lt;/a&gt; is an international, open-access, peer-reviewed journal of the &lt;a href="https://www.isocui.org/" target="_blank" rel="noopener"&gt;International Society for Urban Informatics (ISUI)&lt;/a&gt; and is published online by Springer.
The journal aims to introduce cutting-edge research that leverages emerging technologies and data in the context of urban environments, tackles the relationships among people, place and technology in cities, and advances the science of cities.&lt;/p&gt;
&lt;p&gt;The full list of awards is available &lt;a href="https://www.isocui.org/#/awards/icui2023" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.
Congratulations also to the winners of the 2nd and 3rd places. 🏅&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/10/urban-informatics-paper-of-the-year-award-2023/1_hu_912d533b3cde4326.webp 400w,
/post/2023/10/10/urban-informatics-paper-of-the-year-award-2023/1_hu_d0b3594a39d6e665.webp 760w,
/post/2023/10/10/urban-informatics-paper-of-the-year-award-2023/1_hu_f89e09164ffd4e.webp 1200w"
src="https://ual.sg/post/2023/10/10/urban-informatics-paper-of-the-year-award-2023/1_hu_912d533b3cde4326.webp"
width="760"
height="225"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Thank you for this recognition, and for running this journal and the society.&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Acquiring spatial data of fine and dynamic urban features such as buildings remains challenging. This paper brings attention to real estate advertisements and property sales data as valuable and dynamic sources of geoinformation in the built environment, but unutilised in spatial data infrastructures. Given the wealth of information they hold and their user-generated nature, we put forward the idea of real estate data as an instance of implicit volunteered geographic information and bring attention to their spatial aspect, potentially alleviating the challenge of acquiring spatial data of fine and dynamic urban features. We develop a mechanism of facilitating continuous acquisition, maintenance, and quality assurance of building data and associated amenities from real estate data. The results of the experiments conducted in Singapore reveal that one month of property listings provides information on 7% of the national building stock and about half of the residential subset, e.g. age, type, and storeys, which are often not available in sources such as OpenStreetMap, potentially supporting applications such as 3D city modelling and energy simulations. The method may serve as a novel means to spatial data quality control as it detects missing amenities and maps future buildings, which are advertised and transacted before they are built, but it exhibits mixed results in identifying unmapped buildings as ads may contain errors that impede the idea.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-ui-real-estate-mining/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-ui-real-estate-mining/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/10/urban-informatics-paper-of-the-year-award-2023/page-one_hu_fb03998e23eb9217.webp 400w,
/post/2023/10/10/urban-informatics-paper-of-the-year-award-2023/page-one_hu_f2600c4f69282b2a.webp 760w,
/post/2023/10/10/urban-informatics-paper-of-the-year-award-2023/page-one_hu_5864867d85c794bf.webp 1200w"
src="https://ual.sg/post/2023/10/10/urban-informatics-paper-of-the-year-award-2023/page-one_hu_fb03998e23eb9217.webp"
width="544"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_ui_real_estate_mining&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Mining real estate ads and property transactions for building and amenity data acquisition}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Chen, Xinyu and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Urban Informatics}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1007/s44212-022-00012-2}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{12}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Recent talks by our researchers in international events</title><link>https://ual.sg/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/</link><pubDate>Fri, 06 Oct 2023 10:14:49 +0800</pubDate><guid>https://ual.sg/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/</guid><description>&lt;p&gt;We are happy that recently our PhD and junior researchers have been quite active in outreach, from giving talks to organising sessions and winning awards at the international scale.&lt;/p&gt;
&lt;p&gt;For example, &lt;a href="https://ual.sg/author/yujun-hou/"&gt;Yujun Hou&lt;/a&gt;, &lt;a href="https://ual.sg/author/koichi-ito/"&gt;Koichi Ito&lt;/a&gt;, and &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt; have taken part in the &lt;a href="https://www.mnd.gov.sg/urban-solutions-sustainability-r-d-congress-2023" target="_blank" rel="noopener"&gt;Urban Solutions &amp;amp; Sustainability R&amp;amp;D Congress 2023: Science of Cities Symposium&lt;/a&gt;.
Yujun has also presented at the Data Quality Domain Working Group member meeting of the &lt;a href="https://www.ogc.org" target="_blank" rel="noopener"&gt;Open Geospatial Consortium&lt;/a&gt; (her &lt;a href="https://ual.sg/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/"&gt;paper on establishing a framework for assessing data quality of street-level imagery&lt;/a&gt;), while Winston &amp;ndash; besides giving a talk at the same conference (on his project &lt;a href="https://ual.sg/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/"&gt;Urbanity&lt;/a&gt;) &amp;ndash; gave further talks at the &lt;a href="https://www.planning.org.au" target="_blank" rel="noopener"&gt;Planning Institute of Australia&lt;/a&gt; and the &lt;a href="https://cis.smu.edu.sg" target="_blank" rel="noopener"&gt;SMU College of Integrative Studies&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt; has been at &lt;a href="https://www.3dgeoinfo.org/3dgeoinfo/" target="_blank" rel="noopener"&gt;3D GeoInfo 2023&lt;/a&gt; in Munich, where &lt;a href="https://ual.sg/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/"&gt;she has presented her work and won the best paper award&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Both Koichi and Winston have organised a notable session on street-level imagery as part of &lt;a href="http://sdss2023.spatial-data-science.net" target="_blank" rel="noopener"&gt;The Fourth Spatial Data Science Symposium (SDSS 2023)&lt;/a&gt;, which featured multiple speakers from different countries and 100+ participants.&lt;/p&gt;
&lt;p&gt;Some photos are included below.&lt;/p&gt;
&lt;p&gt;We are proud of our researchers and wish them continued successes.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/1_hu_a67cd5d7d6f042ca.webp 400w,
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/1_hu_f83c5ad63dae7037.webp 760w,
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/1_hu_f9baf3c5062e0732.webp 1200w"
src="https://ual.sg/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/1_hu_a67cd5d7d6f042ca.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/2_hu_9c0662e59639f419.webp 400w,
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/2_hu_f6e092121c84785a.webp 760w,
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/2_hu_c12144d055a65594.webp 1200w"
src="https://ual.sg/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/2_hu_9c0662e59639f419.webp"
width="760"
height="370"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/3_hu_4c3f1aa5add2fb16.webp 400w,
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/3_hu_1289c6d7ad32b5fc.webp 760w,
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/3_hu_3f052e43f4931772.webp 1200w"
src="https://ual.sg/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/3_hu_4c3f1aa5add2fb16.webp"
width="760"
height="519"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/4_hu_ef395ff3ab8be0bd.webp 400w,
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/4_hu_9202faacc35daaf9.webp 760w,
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/4_hu_2c5c72348eb88c28.webp 1200w"
src="https://ual.sg/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/4_hu_ef395ff3ab8be0bd.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/5_hu_3fec49e0c894e2a9.webp 400w,
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/5_hu_e04825dd18f10c07.webp 760w,
/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/5_hu_a0ad31e7c87173d3.webp 1200w"
src="https://ual.sg/post/2023/10/06/recent-talks-by-our-researchers-in-international-events/5_hu_3fec49e0c894e2a9.webp"
width="428"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Guest lecture by Charmaine Ng from the Kyoto Institute of Technology</title><link>https://ual.sg/post/2023/10/04/guest-lecture-by-charmaine-ng-from-the-kyoto-institute-of-technology/</link><pubDate>Wed, 04 Oct 2023 22:04:49 +0800</pubDate><guid>https://ual.sg/post/2023/10/04/guest-lecture-by-charmaine-ng-from-the-kyoto-institute-of-technology/</guid><description>&lt;p&gt;Earlier this week, our Lab and Department hosted Prof &lt;a href="https://www.hyokadb.jim.kit.ac.jp/profile/en.a8ad724387d2489b91764df21b29a00b.html" target="_blank" rel="noopener"&gt;Ming Shan (Charmaine) Ng&lt;/a&gt;,
&lt;a href="https://www.cpf.kit.ac.jp" target="_blank" rel="noopener"&gt;Center for the Possible Futures&lt;/a&gt;,
&lt;a href="https://www.kit.ac.jp/en/" target="_blank" rel="noopener"&gt;Kyoto Institute of Technology&lt;/a&gt;,
Japan &amp;#x1f1ef;&amp;#x1f1f5; for a guest lecture.&lt;/p&gt;
&lt;p&gt;Charmaine is currently a Project Associate Professor at the Kyoto Institute of Technology in Japan. She completed her PhD in the Department of Civil Engineering at ETH Zurich. Her chair’s research focuses on solving complex design problems with novel technologies and systemic solutions. She has given guest lectures on complex facade systems, sustainability, digital transformation, Society 5.0, network studies, project management and contracting, DfMA, and lean at Princeton University, UBC, and TU Graz, among others. She was also trained as an architectural historian at the University of Cambridge. Alongside her academic achievements, she has been practising for more than 10 years. She is a chartered architect in Switzerland and the U.K., a qualified sustainable design professional (LEED AP), and a member of the American Society of Civil Engineers. She has worked for multiple award-winning firms, including Heatherwick Studio and BDP, on many projects, including Google Bay View (an IPD project) and Singapore Changi Terminal 5.&lt;/p&gt;
&lt;p&gt;Charmaine delivered the guest lecture &lt;em&gt;Realising Computational Design of Complex Geometry in Architecture&lt;/em&gt; (poster and abstract below).&lt;/p&gt;
&lt;p&gt;Thanks, and looking forward to future collaborations!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/04/guest-lecture-by-charmaine-ng-from-the-kyoto-institute-of-technology/1_hu_8ddf5129b9d7c1a9.webp 400w,
/post/2023/10/04/guest-lecture-by-charmaine-ng-from-the-kyoto-institute-of-technology/1_hu_c3ff8c92d977be7f.webp 760w,
/post/2023/10/04/guest-lecture-by-charmaine-ng-from-the-kyoto-institute-of-technology/1_hu_ffb90b22cd28ceaa.webp 1200w"
src="https://ual.sg/post/2023/10/04/guest-lecture-by-charmaine-ng-from-the-kyoto-institute-of-technology/1_hu_8ddf5129b9d7c1a9.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/04/guest-lecture-by-charmaine-ng-from-the-kyoto-institute-of-technology/poster_hu_51ed629399b72205.webp 400w,
/post/2023/10/04/guest-lecture-by-charmaine-ng-from-the-kyoto-institute-of-technology/poster_hu_d22c86404e585254.webp 760w,
/post/2023/10/04/guest-lecture-by-charmaine-ng-from-the-kyoto-institute-of-technology/poster_hu_5cc5abb2328d7f98.webp 1200w"
src="https://ual.sg/post/2023/10/04/guest-lecture-by-charmaine-ng-from-the-kyoto-institute-of-technology/poster_hu_51ed629399b72205.webp"
width="538"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract-of-the-lecture"&gt;Abstract of the lecture&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;The talk showcases research and practice about computational design and digital fabrication of complex architecture. It explains how architects in Europe, America and Asia realised computational design of complex geometry building envelopes and structures with some of her past international projects in practice as examples. Dr. Ng introduces multiple key techniques in design practices, such as Design for Manufacture and Assembly (DfMA) and post-rationalisation through computational approaches, industrialised construction methods such as modularisation, prefabrication and digital fabrication, as well as integrated design management approaches such as lean-based Target Value Design, early contractor involvement, Integrated Project Delivery (IPD) and relational contracting. She also presents her research in correspondence to the above topics and demonstrates how scientific research can guide current practice to foster digital transformation, sustainability and circularity. Besides, she discusses her theoretical research on socio-technical network studies using the Actor-Network Theory (ANT) and transdisciplinary studies on liabilities and governance in technology implementation.&lt;/p&gt;
&lt;/blockquote&gt;</description></item><item><title>New paper and open dataset: A Global Feature-Rich Network Dataset of Cities and Dashboard for Comprehensive Urban Analyses</title><link>https://ual.sg/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/</link><pubDate>Sun, 01 Oct 2023 08:21:29 +0800</pubDate><guid>https://ual.sg/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Yap W, Biljecki F (2023): A Global Feature-Rich Network Dataset of Cities and Dashboard for Comprehensive Urban Analyses. Scientific Data 10: 667. &lt;a href="https://doi.org/10.1038/s41597-023-02578-1" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1038/s41597-023-02578-1&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-sd-urbanitydata/2023-sd-urbanitydata.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt;.
Congratulations on the great work! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://github.com/winstonyym/urbanity" target="_blank" rel="noopener"&gt;Urbanity&lt;/a&gt; is a network-based Python package developed by &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt; at our NUS Urban Analytics Lab to automate the construction of feature rich (contextual and semantic) urban networks at any geographical scale. Through an accessible and simple to use interface, users can request heterogeneous urban information such as street view imagery, building morphology, population (including sub-group), and points of interest for target areas of interest.&lt;/p&gt;
&lt;p&gt;The newly released dataset aims to promote contextual analyses of urban networks through the integration and harmonisation of various open data sources. The paper describes the global network dataset and dashboard, which cover 50 cities in 29 countries around the world. The dataset features more than 40 pre-computed indicators spanning street view imagery (SVI), points of interest, population, network topology, and building morphology. We also detail our validation process, for example comparing the Meta population density map with WorldPop estimates for 25 cities and using image visual complexity as a heuristic measure to assess image suitability.&lt;/p&gt;
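&lt;p&gt;To illustrate the underlying data model (this is a hand-rolled sketch, not the Urbanity API; all node and edge attributes below are hypothetical), a feature-rich urban network is simply a street graph whose intersections and segments each carry a dictionary of contextual features alongside the topology:&lt;/p&gt;

```python
# Minimal sketch of a "feature-rich" street network: plain-Python adjacency
# structure where nodes (intersections) and edges (street segments) carry
# contextual attribute dictionaries. Names and values are illustrative only.
from collections import defaultdict


class FeatureRichNetwork:
    """Tiny street network whose nodes and edges hold semantic features."""

    def __init__(self):
        self.node_features = {}        # node id -> feature dict
        self.adj = defaultdict(dict)   # node -> neighbour -> edge feature dict

    def add_node(self, node, **features):
        self.node_features[node] = features

    def add_edge(self, u, v, **features):
        # undirected street segment: store the attributes on both directions
        self.adj[u][v] = features
        self.adj[v][u] = features

    def degree(self, node):
        return len(self.adj[node])


net = FeatureRichNetwork()
# Hypothetical intersections with street-view-derived and census features
net.add_node("A", green_view_index=0.31, poi_count=12, population=450)
net.add_node("B", green_view_index=0.18, poi_count=40, population=1200)
net.add_node("C", green_view_index=0.52, poi_count=3, population=300)
# Hypothetical street segments with morphology/OSM-style attributes
net.add_edge("A", "B", length_m=220.0, highway="residential")
net.add_edge("B", "C", length_m=180.0, highway="primary")

print(net.degree("B"))                              # 2
print(net.node_features["A"]["green_view_index"])   # 0.31
```

In the released dataset this role is played by full OpenStreetMap-derived graphs whose attributes are pre-computed from SVI, population, and POI sources; the sketch only shows the shape of the data structure, not how the features are obtained.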
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/1_hu_25b262c46ea2485a.webp 400w,
/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/1_hu_82cf4e808b03c6c5.webp 760w,
/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/1_hu_4397482defc8c0ef.webp 1200w"
src="https://ual.sg/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/1_hu_25b262c46ea2485a.webp"
width="760"
height="601"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.6084/m9.figshare.22124219" target="_blank" rel="noopener"&gt;The dataset is released openly under Creative Commons 4.0 at Figshare&lt;/a&gt;.
The source code of the Urbanity dashboard is fully accessible &lt;a href="https://github.com/winstonyym/urbdash" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-npjus-urbanity/"&gt;A previous paper published in npj Urban Sustainability in July 2023&lt;/a&gt; presents the method and the open-source software that was used to generate this open dataset.
Please check it out as well.
The software can be found in &lt;a href="https://github.com/winstonyym/urbanity" target="_blank" rel="noopener"&gt;Urbanity&amp;rsquo;s GitHub repository&lt;/a&gt; and is extensively documented.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/2_hu_884e9378f5a3c3c2.webp 400w,
/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/2_hu_a2f8386362dca806.webp 760w,
/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/2_hu_29a02b224c66123d.webp 1200w"
src="https://ual.sg/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/2_hu_884e9378f5a3c3c2.webp"
width="760"
height="260"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Urban network analytics has become an essential tool for understanding and modeling the intricate complexity of cities. We introduce the Urbanity data repository to nurture this growing research field, offering a comprehensive, open spatial network resource spanning 50 major cities in 29 countries worldwide. Our workflow enhances OpenStreetMap networks with 40 + high-resolution indicators from open global sources such as street view imagery, building morphology, urban population, and points of interest, catering to a diverse range of applications across multiple fields. We extract streetscape semantic features from more than four million street view images using computer vision. The dataset’s strength lies in its thorough processing and validation at every stage, ensuring data quality and consistency through automated and manual checks. Accompanying the dataset is an interactive, web-based dashboard we developed which facilitates data access to even non-technical stakeholders. Urbanity aids various GeoAI and city comparative analyses, underscoring the growing importance of urban network analytics research.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-sd-urbanitydata/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-sd-urbanitydata/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/page-one_hu_de3e794f773ca854.webp 400w,
/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/page-one_hu_1e8a5f7af2991da8.webp 760w,
/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/page-one_hu_38b2930d879aeccf.webp 1200w"
src="https://ual.sg/post/2023/10/01/new-paper-and-open-dataset-a-global-feature-rich-network-dataset-of-cities-and-dashboard-for-comprehensive-urban-analyses/page-one_hu_de3e794f773ca854.webp"
width="575"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_sd_urbanitydata&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Yap, Winston and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1038/s41597-023-02578-1}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Scientific Data}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{A Global Feature-Rich Network Dataset of Cities and Dashboard for Comprehensive Urban Analyses}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{667}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Global urban road network patterns</title><link>https://ual.sg/post/2023/09/30/new-paper-global-urban-road-network-patterns/</link><pubDate>Sat, 30 Sep 2023 15:25:16 +0800</pubDate><guid>https://ual.sg/post/2023/09/30/new-paper-global-urban-road-network-patterns/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Chen W, Huang H, Liao S, Gao F, Biljecki F (2024): Global urban road network patterns: Unveiling multiscale planning paradigms of 144 cities with a novel deep learning approach. Landscape and Urban Planning 241: 104901. &lt;a href="https://doi.org/10.1016/j.landurbplan.2023.104901" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.landurbplan.2023.104901&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-land-urn/2024-land-urn.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/wangyang-chen/"&gt;Wangyang Chen&lt;/a&gt;.
Congratulations on the great and innovative work and publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The code can be found in its &lt;a href="https://github.com/ualsg/Global-road-network-patterns" target="_blank" rel="noopener"&gt;GitHub repository&lt;/a&gt;.
The paper is available below.&lt;/p&gt;
&lt;p&gt;The &lt;a href="https://ual.sg/publication/2021-ceus-dl-morphology/"&gt;previous publication&lt;/a&gt; stemming from this research line, which set the foundation of this project, was published in CEUS in 2021.
This paper considerably expands that work with several methodological innovations.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/30/new-paper-global-urban-road-network-patterns/1_hu_3b2afe11173321a0.webp 400w,
/post/2023/09/30/new-paper-global-urban-road-network-patterns/1_hu_bfc3687bafdbb517.webp 760w,
/post/2023/09/30/new-paper-global-urban-road-network-patterns/1_hu_34f0a758b5d34e8c.webp 1200w"
src="https://ual.sg/post/2023/09/30/new-paper-global-urban-road-network-patterns/1_hu_3b2afe11173321a0.webp"
width="760"
height="683"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/30/new-paper-global-urban-road-network-patterns/2_hu_43220f03b438f5f5.webp 400w,
/post/2023/09/30/new-paper-global-urban-road-network-patterns/2_hu_c14cd130fb8eed88.webp 760w,
/post/2023/09/30/new-paper-global-urban-road-network-patterns/2_hu_b1d08c43239f1786.webp 1200w"
src="https://ual.sg/post/2023/09/30/new-paper-global-urban-road-network-patterns/2_hu_43220f03b438f5f5.webp"
width="760"
height="721"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The paper is &lt;a href="https://authors.elsevier.com/a/1hrEwcUG5SifV" target="_blank" rel="noopener"&gt;available freely&lt;/a&gt; until 2023-11-19.&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Comprehensive global study on urban road patterns with a novel approach.&lt;/li&gt;
&lt;li&gt;Road network patterns of 144 cities worldwide are identified with deep learning.&lt;/li&gt;
&lt;li&gt;Pattern similarities and disparities between and within cities are investigated.&lt;/li&gt;
&lt;li&gt;Multi-scale approach with new metrics enables uncovering hidden patterns.&lt;/li&gt;
&lt;li&gt;Road patterns correlate with urban socioeconomic and environmental conditions.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Urban road networks (URNs) are ubiquitous and essential components of cities. Visually, they present diverse patterns that embody latent planning principles. However, we still lack a global insight into such patterns. In this paper, we propose a scalable deep learning-based framework to automate accurate and multiscale classification of road network patterns in cities and present a comprehensive global implementation on 144 major cities around the world, yielding their multiscale pattern profiles and urban fabrics, highlighting both similarities and contrasts. We observe significant disparities across continents and regions, particularly at larger scales. We give particular attention to exploring inter-city pattern similarities with new metrics we introduce, and uncover subgroups in each continent, unveiling the potential intercontinental dissemination of planning paradigms. We establish four modes of intra-city spatial distribution of patterns considering diversity and clustering. Notably, radial road networks are found to be positively correlated with GDP per capita and negatively correlated with PM2.5 pollution. Our global study provides a new perspective to understand the URN texture of cities, which helps to understand the externalities of different road patterns and accordingly promote scientific and sustainable solutions for urban development.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-land-urn/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-land-urn/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/30/new-paper-global-urban-road-network-patterns/page-one_hu_830eacebe9f00c09.webp 400w,
/post/2023/09/30/new-paper-global-urban-road-network-patterns/page-one_hu_5a1d8fac9d7be103.webp 760w,
/post/2023/09/30/new-paper-global-urban-road-network-patterns/page-one_hu_f85df196927ef133.webp 1200w"
src="https://ual.sg/post/2023/09/30/new-paper-global-urban-road-network-patterns/page-one_hu_830eacebe9f00c09.webp"
width="567"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_land_urn&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Chen, Wangyang and Huang, Huiming and Liao, Shunyi and Gao, Feng and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.landurbplan.2023.104901}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Landscape and Urban Planning}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Global urban road network patterns: Unveiling multiscale planning paradigms of 144 cities with a novel deep learning approach}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{241}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104901}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Explainable spatially explicit geospatial artificial intelligence in urban analytics</title><link>https://ual.sg/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/</link><pubDate>Fri, 29 Sep 2023 18:25:16 +0800</pubDate><guid>https://ual.sg/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Liu P, Zhang Y, Biljecki F (2024): Explainable spatially explicit geospatial artificial intelligence in urban analytics. Environment and Planning B: Urban Analytics and City Science, 51(5): 1104&amp;ndash;1123. &lt;a href="https://doi.org/10.1177/23998083231204689" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1177/23998083231204689&lt;/a&gt; &lt;a href="https://ual.sg/publication/2024-epb-xai/2024-epb-xai.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/pengyuan-liu/"&gt;Pengyuan Liu&lt;/a&gt;.
Congratulations on the great work! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The code developed in this research has been released open source in its &lt;a href="https://github.com/PengyuanLiu1993/XAI-Urban-Analytics" target="_blank" rel="noopener"&gt;GitHub repository&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The paper is available below.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/1_hu_25eee0318926f7ba.webp 400w,
/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/1_hu_96112e160f5a60e1.webp 760w,
/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/1_hu_4294385e9a4a45eb.webp 1200w"
src="https://ual.sg/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/1_hu_25eee0318926f7ba.webp"
width="760"
height="272"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/2_hu_62268bda48740c07.webp 400w,
/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/2_hu_68df242471cbafd2.webp 760w,
/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/2_hu_c827465db8e83361.webp 1200w"
src="https://ual.sg/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/2_hu_62268bda48740c07.webp"
width="760"
height="336"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Geospatial artificial intelligence (GeoAI) is proliferating in urban analytics, where graph neural networks (GNNs) have become one of the most popular methods in recent years. However, along with the success of GNNs, the black box nature of AI models has led to various concerns (e.g. algorithmic bias and model misuse) regarding their adoption in urban analytics, particularly when studying socio-economics where high transparency is a crucial component of social justice. Therefore, the desire for increased model explainability and interpretability has attracted increasing research interest. This article proposes an explainable spatially explicit GeoAI-based analytical method that combines a graph convolutional network (GCN) and a graph-based explainable AI (XAI) method, called GNNExplainer. Here, we showcase the ability of our proposed method in two studies within urban analytics: traffic volume prediction and population estimation in the tasks of a node classification and a graph classification, respectively. For these tasks, we used Street View Imagery (SVI), a trending data source in urban analytics. We extracted semantic information from the images and assigned them as features of urban roads. The GCN first provided reasonable predictions related to these tasks by encoding roads as nodes and their connectivities and networks as graphs. The GNNExplainer then offered insights into how certain predictions are made. Through such a process, practical insights and conclusions can be derived from the urban phenomena studied here. In this paper we also set out a path for developing XAI in future urban studies.&lt;/p&gt;
&lt;/blockquote&gt;
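&lt;p&gt;As a toy illustration of the propagation rule behind the graph convolutional network mentioned in the abstract (this is not the released code; the three-road graph, the single SVI-derived feature, and the weight are made up for the example), one GCN layer computes H' = ReLU(D&lt;sup&gt;-1/2&lt;/sup&gt;(A+I)D&lt;sup&gt;-1/2&lt;/sup&gt;HW) over the road-connectivity graph:&lt;/p&gt;

```python
# One graph-convolution step on a toy 3-road network, in pure Python:
# H' = ReLU( D^{-1/2} (A + I) D^{-1/2} H W )
# Roads are nodes; edges encode connectivity. Features/weights are hypothetical.
import math

A = [[0, 1, 0],            # road connectivity (adjacency matrix)
     [1, 0, 1],
     [0, 1, 0]]
H = [[0.3], [0.7], [0.5]]  # one SVI-derived feature per road (made up)
W = [[2.0]]                # learned weight (made up)

n = len(A)
# add self-loops: A_hat = A + I
A_hat = [[A[i][j] + (1 if i == j else 0) for j in range(n)] for i in range(n)]
deg = [sum(row) for row in A_hat]
# symmetric normalisation: S = D^{-1/2} A_hat D^{-1/2}
S = [[A_hat[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)]
     for i in range(n)]


def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]


# propagate features and apply ReLU
H_next = [[max(0.0, v) for v in row] for row in matmul(matmul(S, H), W)]
print(H_next)  # ≈ [[0.872], [1.120], [1.072]]
```

Each road's updated representation mixes its own feature with those of connected roads, which is what lets the model exploit network structure; GNNExplainer then attributes a prediction back to the most influential edges and features of this computation.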
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2024-epb-xai/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2024-epb-xai/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/page-one_hu_e66b59f63a40c4b7.webp 400w,
/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/page-one_hu_fc20e7ea115ec2bb.webp 760w,
/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/page-one_hu_1361f11880e35c48.webp 1200w"
src="https://ual.sg/post/2023/09/29/new-paper-explainable-spatially-explicit-geospatial-artificial-intelligence-in-urban-analytics/page-one_hu_e66b59f63a40c4b7.webp"
width="524"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2024_epb_xai&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Liu, Pengyuan and Yan, Zhang and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1177/23998083231204689}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Environment and Planning B: Urban Analytics and City Science}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1104--1123}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Explainable spatially explicit geospatial artificial intelligence in urban analytics}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{51}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;issue&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{5}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2024}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Our activities at the 11th International Conference on Urban Climate in Sydney</title><link>https://ual.sg/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/</link><pubDate>Mon, 25 Sep 2023 09:36:49 +0800</pubDate><guid>https://ual.sg/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/</guid><description>&lt;p&gt;The &lt;a href="https://icuc11.com/" target="_blank" rel="noopener"&gt;11th edition of the International Conference on Urban Climate (ICUC)&lt;/a&gt; was hosted in Sydney, Australia 🇦🇺.
This truly multidisciplinary conference was organised by the &lt;a href="https://www.unsw.edu.au" target="_blank" rel="noopener"&gt;University of New South Wales&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Big thanks to &lt;a href="https://research.unsw.edu.au/people/dr-negin-nazarian" target="_blank" rel="noopener"&gt;Negin Nazarian&lt;/a&gt; and &lt;a href="https://research.unsw.edu.au/people/associate-professor-melissa-anne-hart" target="_blank" rel="noopener"&gt;Melissa Hart&lt;/a&gt; for the meticulous and smooth organisation.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/1_hu_bca8fd2717641224.webp 400w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/1_hu_de6e2980f5df18ba.webp 760w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/1_hu_8d83a3ad76e5b671.webp 1200w"
src="https://ual.sg/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/1_hu_bca8fd2717641224.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/2_hu_726fd96be8fda881.webp 400w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/2_hu_5b3bf1c76c22cd5.webp 760w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/2_hu_664b17ddf1d3d911.webp 1200w"
src="https://ual.sg/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/2_hu_726fd96be8fda881.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/5_hu_930dc29f5190588b.webp 400w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/5_hu_ebc6f6127903fd46.webp 760w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/5_hu_f5afb4c6aba22018.webp 1200w"
src="https://ual.sg/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/5_hu_930dc29f5190588b.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Our research group and others from the National University of Singapore were well represented.
In particular, we shared details about two ongoing projects.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/marcel-ignatius/"&gt;Marcel Ignatius&lt;/a&gt; presented his multidisciplinary project &lt;em&gt;Data-driven models for understanding urban canopy air temperature distribution: a case study in the tropics&lt;/em&gt;.
This work is being carried out in collaboration with &lt;a href="https://ual.sg/author/kunihiko-fujiwara/"&gt;Kunihiko Fujiwara&lt;/a&gt;.
More information about this research can be found &lt;a href="https://www.linkedin.com/posts/marcelignatius_icuc11sydney-urbanclimate-microclimateanalysis-activity-7106133611975737344-Am9u" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; has shared details on the investigations on &lt;em&gt;Crowdsourced imagery for urban climate informatics&lt;/em&gt;, conducted with &lt;a href="https://ual.sg/author/tianhong-zhao/"&gt;Tianhong Zhao&lt;/a&gt;, &lt;a href="https://ual.sg/author/xiucheng-liang/"&gt;Xiucheng Liang&lt;/a&gt;, and &lt;a href="https://ual.sg/author/yujun-hou/"&gt;Yujun Hou&lt;/a&gt;.
Further, together with Mathew Lipson (Australian Bureau of Meteorology), he co-chaired the session &lt;em&gt;Urban data for climate modelling and sustainable cities&lt;/em&gt;, which included contributions from several countries.&lt;/p&gt;
&lt;p&gt;He also gave a guest lecture at the &lt;a href="https://www.sydney.edu.au" target="_blank" rel="noopener"&gt;University of Sydney&lt;/a&gt; &amp;ndash; &lt;a href="https://www.sydney.edu.au/medicine-health/our-research/research-centres/heat-and-health-research-incubator.html" target="_blank" rel="noopener"&gt;Heat &amp;amp; Health Research Incubator&lt;/a&gt;.
Many thanks to &lt;a href="https://federicotartarini.github.io" target="_blank" rel="noopener"&gt;Federico Tartarini&lt;/a&gt; for hosting it.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/4_hu_158fd78301dd94fc.webp 400w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/4_hu_a293906e4e7813c7.webp 760w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/4_hu_e1899ab903701245.webp 1200w"
src="https://ual.sg/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/4_hu_158fd78301dd94fc.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/7_hu_73a2c3f6f23a5336.webp 400w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/7_hu_88d258efe4b835c4.webp 760w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/7_hu_87f26668c1701d35.webp 1200w"
src="https://ual.sg/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/7_hu_73a2c3f6f23a5336.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/8_hu_99c3e4539466d9b4.webp 400w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/8_hu_5f2f72ba0739a29.webp 760w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/8_hu_8041b45ef4aed0de.webp 1200w"
src="https://ual.sg/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/8_hu_99c3e4539466d9b4.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/9_hu_7de63e5a0473e5d9.webp 400w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/9_hu_5c974fc5d717adb3.webp 760w,
/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/9_hu_da91b2c7629c6151.webp 1200w"
src="https://ual.sg/post/2023/09/25/our-activities-at-the-11th-international-conference-on-urban-climate-in-sydney/9_hu_7de63e5a0473e5d9.webp"
width="760"
height="351"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The hospitality is very much appreciated, and we look forward to the next instance of the conference.&lt;/p&gt;</description></item><item><title>Keynote at the City+2023@Perth International Conference</title><link>https://ual.sg/post/2023/09/21/keynote-at-the-city-2023@perth-international-conference/</link><pubDate>Thu, 21 Sep 2023 06:55:49 +0800</pubDate><guid>https://ual.sg/post/2023/09/21/keynote-at-the-city-2023@perth-international-conference/</guid><description>&lt;p&gt;The &lt;a href="https://yongzesong.com/cityplus-2023/" target="_blank" rel="noopener"&gt;2023 edition of the City+ International Conference&lt;/a&gt; was organised at Curtin University in Perth, Australia 🇦🇺.&lt;/p&gt;
&lt;p&gt;The PI of the Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, represented the Lab by giving a keynote talk.&lt;/p&gt;
&lt;p&gt;Big thanks to &lt;a href="https://yongzesong.com" target="_blank" rel="noopener"&gt;Yongze Song&lt;/a&gt; and the team for the invitation and the fantastic organisation of the event!&lt;/p&gt;
&lt;p&gt;This year&amp;rsquo;s theme of the conference was &amp;lsquo;Geospatial Big Data and Artificial Intelligence for Cities&amp;rsquo;, and Yongze and the team attracted 300 participants from 29 countries.&lt;/p&gt;
&lt;p&gt;To follow the work of Yongze and the team, check out the &lt;a href="https://yongzesong.com" target="_blank" rel="noopener"&gt;website&lt;/a&gt;.
In 2024, Curtin University will host another &amp;lsquo;geo-conference&amp;rsquo;: the Mid-Term Symposium of the &lt;a href="https://www2.isprs.org/commissions/comm4/activities/" target="_blank" rel="noopener"&gt;ISPRS Technical Commission IV &amp;lsquo;Spatial Information Science&amp;rsquo;&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The hospitality is very much appreciated, and we look forward to collaborating.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/21/keynote-at-the-city-2023@perth-international-conference/1_hu_d5b3d6440a94729e.webp 400w,
/post/2023/09/21/keynote-at-the-city-2023@perth-international-conference/1_hu_2b4db46ff5b17c1e.webp 760w,
/post/2023/09/21/keynote-at-the-city-2023@perth-international-conference/1_hu_aa466a711e94b05e.webp 1200w"
src="https://ual.sg/post/2023/09/21/keynote-at-the-city-2023@perth-international-conference/1_hu_d5b3d6440a94729e.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/21/keynote-at-the-city-2023@perth-international-conference/2_hu_fe9cf407d08f91db.webp 400w,
/post/2023/09/21/keynote-at-the-city-2023@perth-international-conference/2_hu_5a806c1354252180.webp 760w,
/post/2023/09/21/keynote-at-the-city-2023@perth-international-conference/2_hu_b15e716a586e3b73.webp 1200w"
src="https://ual.sg/post/2023/09/21/keynote-at-the-city-2023@perth-international-conference/2_hu_fe9cf407d08f91db.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Guest lecture by Angel Hsu from The University of North Carolina at Chapel Hill</title><link>https://ual.sg/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/</link><pubDate>Mon, 18 Sep 2023 18:01:49 +0800</pubDate><guid>https://ual.sg/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/</guid><description>&lt;p&gt;Our Lab and Department are hosting Prof &lt;a href="https://publicpolicy.unc.edu/person/hsu-angel/" target="_blank" rel="noopener"&gt;Angel Hsu&lt;/a&gt; from the &lt;a href="https://datadrivenlab.org" target="_blank" rel="noopener"&gt;Data-Driven EnviroLab&lt;/a&gt; at &lt;a href="https://www.unc.edu/" target="_blank" rel="noopener"&gt;The University of North Carolina at Chapel Hill&lt;/a&gt;,
USA. &amp;#x1f1fa;&amp;#x1f1f8;&lt;/p&gt;
&lt;p&gt;Dr Angel Hsu is an Assistant Professor of Public Policy and the Environment at UNC-Chapel Hill. She is the founder and director of the Data-Driven EnviroLab, an interdisciplinary research group that applies quantitative approaches to pressing environmental issues. Her work focuses on the convergence of urbanization and climate change, specifically how cities act both as contributors to and potential solvers of climate change. She was a contributing author to the IPCC Sixth Assessment Report and a lead author of the chapter on non-state and subnational actors in the 2018 UNEP Emissions Gap Report. She regularly advises governments, has chaired and contributed to the World Economic Forum’s Global Future Councils, and was an invited speaker at the 2018 TED Age of Amazement and 2020 TED Climate Countdown events. She holds a Ph.D. in Environmental Policy from Yale University and was formerly an Assistant Professor of Environmental Studies at Yale-NUS College in Singapore from 2015 to 2020.&lt;/p&gt;
&lt;p&gt;Angel delivered the guest lecture &lt;em&gt;Cities on the Climate Frontlines: Evaluating Urban Climate Change and Policy Responses&lt;/em&gt; (poster and abstract below).&lt;/p&gt;
&lt;p&gt;Thanks, and looking forward to future collaborations!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/2_hu_8a32155bceb7a712.webp 400w,
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/2_hu_4a27a12b3a807d.webp 760w,
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/2_hu_eaca6202ff1f6615.webp 1200w"
src="https://ual.sg/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/2_hu_8a32155bceb7a712.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/3_hu_a23ff3affa2e52dc.webp 400w,
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/3_hu_3e03675172a07e42.webp 760w,
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/3_hu_90c44199464fc5a6.webp 1200w"
src="https://ual.sg/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/3_hu_a23ff3affa2e52dc.webp"
width="760"
height="393"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/4_hu_c6bbe9b3e731a70f.webp 400w,
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/4_hu_8a137acedb5f5c10.webp 760w,
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/4_hu_d647d39c365e3497.webp 1200w"
src="https://ual.sg/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/4_hu_c6bbe9b3e731a70f.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/poster_hu_bfa21fc159b39c34.webp 400w,
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/poster_hu_b5daf61be6538424.webp 760w,
/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/poster_hu_637e59bb1264b614.webp 1200w"
src="https://ual.sg/post/2023/09/18/guest-lecture-by-angel-hsu-from-the-university-of-north-carolina-at-chapel-hill/poster_hu_bfa21fc159b39c34.webp"
width="537"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract-of-the-lecture"&gt;Abstract of the lecture&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Cities are both contributors and potential problem solvers of the global climate crisis. They are also vulnerable to climate change impacts, including sea-level rise, extreme heat, and natural disasters. Consequently, they have risen to become prominent climate and environmental sustainability agents, with the UN’s Sustainable Development Goal 11 charging cities to be both sustainable and inclusive and the Paris Agreement’s recognition of their contributions. In this talk, I’ll introduce how my research is utilizing data science to evaluate cities’ contributions to global climate mitigation efforts and urban sustainable development more broadly.&lt;/p&gt;
&lt;/blockquote&gt;</description></item><item><title>Best paper award and keynote at the 3D GeoInfo 2023 conference</title><link>https://ual.sg/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/</link><pubDate>Sun, 17 Sep 2023 21:55:19 +0800</pubDate><guid>https://ual.sg/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/</guid><description>&lt;p&gt;The &lt;a href="https://www.3dgeoinfo.org/3dgeoinfo/" target="_blank" rel="noopener"&gt;18th International 3D GeoInfo Conference 2023&lt;/a&gt; took place in Munich, Germany on 13-14 September 2023.
It was hosted by the &lt;a href="https://www.asg.ed.tum.de/en/gis/home/" target="_blank" rel="noopener"&gt;Chair of Geoinformatics at the Technical University of Munich&lt;/a&gt; led by &lt;a href="https://www.asg.ed.tum.de/en/gis/our-team/staff/prof-thomas-h-kolbe/" target="_blank" rel="noopener"&gt;Professor Thomas H. Kolbe&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;3D GeoInfo is the leading conference in this domain, and it started in 2006 in Kuala Lumpur, Malaysia under the auspices of Professor Alias Abdul Rahman and his research group.
We organised its &lt;a href="https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/"&gt;2019 instance in Singapore&lt;/a&gt;, together with the Singapore Land Authority.&lt;/p&gt;
&lt;p&gt;Prior to the conference, a full-day workshop &amp;mdash; &lt;em&gt;International Forum on Urban Digital Twins&lt;/em&gt; &amp;mdash; was organised by the City of Munich in cooperation with TUM.&lt;/p&gt;
&lt;p&gt;Both the Forum and the conference featured many interesting talks on the most recent developments in 3D GIS and urban digital twins.
Academics (mostly representing research groups in Europe), local governments (e.g. Munich, Barcelona, Utrecht, Helsinki, &amp;hellip;), and practitioners were all well represented and contributed significantly.
The European Commission gave a presentation as well.&lt;/p&gt;
&lt;p&gt;Our research group was represented by &lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt; and &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt; has presented the paper &lt;em&gt;Humans as Sensors in Urban Digital Twins&lt;/em&gt;, which was coauthored with &lt;a href="https://ual.sg/author/yunlei-su/"&gt;Yunlei Su&lt;/a&gt; and &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;.
This paper was awarded the best paper of the conference. 🏆&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; gave a keynote &amp;mdash; &lt;em&gt;Research Insights and Perspectives of Urban Digital Twins&lt;/em&gt; &amp;mdash; presenting our latest work, especially in the domain of urban digital twins, which is conducted also in collaboration with other research groups at NUS.&lt;/p&gt;
&lt;p&gt;It was a great conference.
We highly appreciate the organisation by TU Munich and the City of Munich (please see the photo below for the full list of people involved).&lt;/p&gt;
&lt;p&gt;The next instance of the conference, in July 2024, will be organised in Vigo, Spain under the leadership of Lucía Díaz Vilariño.
See below for a slide with more information, including the dates.&lt;/p&gt;
&lt;p&gt;Our Lab remains committed to contributing to this vibrant community, and we very much look forward to the next instance of the conference.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/1_hu_ef395ff3ab8be0bd.webp 400w,
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/1_hu_9202faacc35daaf9.webp 760w,
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/1_hu_2c5c72348eb88c28.webp 1200w"
src="https://ual.sg/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/1_hu_ef395ff3ab8be0bd.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/2_hu_4b9a624a4525e9a5.webp 400w,
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/2_hu_71af8901ed04e375.webp 760w,
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/2_hu_40a6ffee9c4fa0bd.webp 1200w"
src="https://ual.sg/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/2_hu_4b9a624a4525e9a5.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/3_hu_7d513d366b51b102.webp 400w,
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/3_hu_bd2ba8d7ce4a4bc0.webp 760w,
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/3_hu_90dac57fa2845d9e.webp 1200w"
src="https://ual.sg/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/3_hu_7d513d366b51b102.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/4_hu_96090c038322cc29.webp 400w,
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/4_hu_9eb299501da090af.webp 760w,
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/4_hu_ba22085c1c6203bf.webp 1200w"
src="https://ual.sg/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/4_hu_96090c038322cc29.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/5_hu_8b393be1a6446ae7.webp 400w,
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/5_hu_2115eb668910ab06.webp 760w,
/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/5_hu_2806f27ee8865335.webp 1200w"
src="https://ual.sg/post/2023/09/17/best-paper-award-and-keynote-at-the-3d-geoinfo-2023-conference/5_hu_8b393be1a6446ae7.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Welcoming three new PhD researchers to the team</title><link>https://ual.sg/post/2023/08/07/welcoming-three-new-phd-researchers-to-the-team/</link><pubDate>Mon, 07 Aug 2023 19:28:49 +0800</pubDate><guid>https://ual.sg/post/2023/08/07/welcoming-three-new-phd-researchers-to-the-team/</guid><description>&lt;p&gt;A warm welcome to our three new researchers embarking on their doctoral journeys this month: &lt;a href="https://ual.sg/author/yixin-wu/"&gt;Yixin Wu&lt;/a&gt;, &lt;a href="https://ual.sg/author/xiucheng-liang/"&gt;Xiucheng Liang&lt;/a&gt;, and &lt;a href="https://ual.sg/author/sijie-yang/"&gt;Sijie Yang&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;They will work on our research lines exploring emerging urban datasets and their new applications in data-driven urban planning.&lt;/p&gt;
&lt;p&gt;It is great to have them with us, and we are excited to follow their progress in developing innovative methods to harness urban data and transform it into actionable insights that can help us build smarter and better cities. 🏙️💡&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/07/welcoming-three-new-phd-researchers-to-the-team/featured_hu_25194ae293319de.webp 400w,
/post/2023/08/07/welcoming-three-new-phd-researchers-to-the-team/featured_hu_efd97bdbe8420432.webp 760w,
/post/2023/08/07/welcoming-three-new-phd-researchers-to-the-team/featured_hu_a557a47eee3c2568.webp 1200w"
src="https://ual.sg/post/2023/08/07/welcoming-three-new-phd-researchers-to-the-team/featured_hu_25194ae293319de.webp"
width="760"
height="447"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Guest lecture by Gamze Dane from Eindhoven University of Technology</title><link>https://ual.sg/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/</link><pubDate>Mon, 07 Aug 2023 19:18:49 +0800</pubDate><guid>https://ual.sg/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/</guid><description>&lt;p&gt;This week, our Lab and department are hosting Dr &lt;a href="https://gamzedane.com" target="_blank" rel="noopener"&gt;Gamze Dane&lt;/a&gt;,
&lt;a href="https://www.tue.nl/en/our-university/departments/built-environment" target="_blank" rel="noopener"&gt;Department of Built Environment&lt;/a&gt;,
&lt;a href="https://www.tue.nl/en/" target="_blank" rel="noopener"&gt;Eindhoven University of Technology&lt;/a&gt;,
the Netherlands. &amp;#x1f1f3;&amp;#x1f1f1;&lt;/p&gt;
&lt;p&gt;Gamze Dane is an Assistant Professor of Digital Urban Development at the Department of Built Environment of Eindhoven University of Technology (TU/e). She is also one of the lead investigators of the Urban Development Initiative (UDI) in the Eindhoven Region. She has an interdisciplinary background, with a Ph.D. in “Urban Planning” and an MSc in “Geographical Information Systems (GIS) and Decision Making”. Her areas of expertise include decision-support systems in participatory urban planning, human-environment interaction, GIS and data analytics. As the principal investigator of numerous large-scale European and national innovation projects, she has gained vast experience in transdisciplinary research, working with citizens, SMEs, NGOs and European cities (e.g., Eindhoven (NL), Helmond (NL), Bologna (IT), Lisbon (PT), Skopje (NMK), Istanbul (TR)).&lt;/p&gt;
&lt;p&gt;To kickstart her stay, Gamze delivered the guest lecture &lt;em&gt;Experiencing the Future of Cities through Virtual Reality&lt;/em&gt; (poster and abstract below).&lt;/p&gt;
&lt;p&gt;Thanks, and looking forward to future collaborations!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/1_hu_275bc4910e91fa3d.webp 400w,
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/1_hu_825a7d7aba689ab1.webp 760w,
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/1_hu_69e3f12296326485.webp 1200w"
src="https://ual.sg/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/1_hu_275bc4910e91fa3d.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/2_hu_1161f7b2ee3e5211.webp 400w,
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/2_hu_57fd271b98d8b99d.webp 760w,
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/2_hu_4764e526204cce5c.webp 1200w"
src="https://ual.sg/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/2_hu_1161f7b2ee3e5211.webp"
width="571"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/3_hu_4a3c4ccdcf92b139.webp 400w,
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/3_hu_27f1df262d7de6b4.webp 760w,
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/3_hu_9f85734a54c5e89a.webp 1200w"
src="https://ual.sg/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/3_hu_4a3c4ccdcf92b139.webp"
width="760"
height="552"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/poster_hu_f21d5cfaa9f12500.webp 400w,
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/poster_hu_5eaff168e55dcb84.webp 760w,
/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/poster_hu_efcd493428e8c89e.webp 1200w"
src="https://ual.sg/post/2023/08/07/guest-lecture-by-gamze-dane-from-eindhoven-university-of-technology/poster_hu_f21d5cfaa9f12500.webp"
width="538"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract-of-the-lecture"&gt;Abstract of the lecture&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;To achieve sustainable, inclusive, and livable cities, it is necessary for citizens and urban planning stakeholders to efficiently experience, discuss, and comprehend the consequences of various alternative urban intervention scenarios. This process serves two key purposes: (i) enhancing the planning practice by fostering participation and communication among stakeholders and citizens, and (ii) facilitating informed decision-making in urban planning. To this end, Virtual Reality (VR) technology offers a promising solution by providing immersive and experiential tools that allow users to envision and explore the future of cities through simulated environments.
This presentation will showcase examples of projects where VR technology was utilized to&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;support participatory design processes. Through the development of an interactive immersive VR application, users were enabled to experience, create, and discuss different design options for healthy public space design.&lt;/li&gt;
&lt;li&gt;gather data by tracking users’ behavior and choices within virtual environments that represent future urban scenarios. These insights are then used as input for agent-based models, which will predict the acceptance and use of future urban interventions.&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;</description></item><item><title>Visits to Thai universities and organisations</title><link>https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/</link><pubDate>Sun, 06 Aug 2023 18:55:49 +0800</pubDate><guid>https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/</guid><description>&lt;p&gt;The PI of the Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, represented the Lab during a recent research visit to Thailand &amp;#x1f1f9;&amp;#x1f1ed;.&lt;/p&gt;
&lt;p&gt;He has been invited to give guest lectures and participate in collaborative exchanges at multiple leading Thai institutions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.chula.ac.th/en/" target="_blank" rel="noopener"&gt;Chulalongkorn University&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://ait.ac.th/" target="_blank" rel="noopener"&gt;Asian Institute of Technology&lt;/a&gt;, &lt;a href="https://ait.ac.th/centre/geoinformatics-center/" target="_blank" rel="noopener"&gt;Geoinformatics Centre&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.cmu.ac.th/en/" target="_blank" rel="noopener"&gt;Chiang Mai University&lt;/a&gt;, &lt;a href="https://www.cmu.ac.th/en/faculty/engineering/aboutus/head" target="_blank" rel="noopener"&gt;Faculty of Engineering&lt;/a&gt; (a press release by the university can be found &lt;a href="https://eng.cmu.ac.th/?p=34277" target="_blank" rel="noopener"&gt;here&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.uddc.net/" target="_blank" rel="noopener"&gt;Urban Design and Development Center&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Besides the guest lectures and workshops, he met vice presidents, deans and heads of departments, and discussed collaboration opportunities.&lt;/p&gt;
&lt;p&gt;Many thanks to the hosts for the insightful and productive discussions and for sharing their inspiring work.
The hospitality is very much appreciated, and we look forward to collaborating.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/06/visits-to-thai-universities-and-organisations/1_hu_c4dce94c9b0c8e6a.webp 400w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/1_hu_2409dbf7263cbec0.webp 760w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/1_hu_e2f9a762ba38f791.webp 1200w"
src="https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/1_hu_c4dce94c9b0c8e6a.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/06/visits-to-thai-universities-and-organisations/2_hu_138d3d28b74205c0.webp 400w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/2_hu_8fcf3a911c4e07ad.webp 760w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/2_hu_c0348419290e9a48.webp 1200w"
src="https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/2_hu_138d3d28b74205c0.webp"
width="760"
height="355"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/06/visits-to-thai-universities-and-organisations/3_hu_85dfbce6dcd9b775.webp 400w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/3_hu_de85981dd91877b9.webp 760w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/3_hu_c8ee044bf111ac51.webp 1200w"
src="https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/3_hu_85dfbce6dcd9b775.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/06/visits-to-thai-universities-and-organisations/4_hu_9435c36379781887.webp 400w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/4_hu_63f1a899bf63bbad.webp 760w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/4_hu_c980e1e54cc7351d.webp 1200w"
src="https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/4_hu_9435c36379781887.webp"
width="760"
height="565"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/06/visits-to-thai-universities-and-organisations/5_hu_542799d038f14b9a.webp 400w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/5_hu_f5a4a2a2dec2a386.webp 760w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/5_hu_6c62f921efd07f08.webp 1200w"
src="https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/5_hu_542799d038f14b9a.webp"
width="760"
height="505"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/06/visits-to-thai-universities-and-organisations/6_hu_31161d78e005b828.webp 400w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/6_hu_e73fc9685cc8ab28.webp 760w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/6_hu_619cefb260e033a.webp 1200w"
src="https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/6_hu_31161d78e005b828.webp"
width="428"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/06/visits-to-thai-universities-and-organisations/7_hu_c85f294567d5a705.webp 400w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/7_hu_10c7032c653905fa.webp 760w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/7_hu_56f30a42122d8836.webp 1200w"
src="https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/7_hu_c85f294567d5a705.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/06/visits-to-thai-universities-and-organisations/8_hu_fab52cf0c8e79e8d.webp 400w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/8_hu_be4776bfccf8832e.webp 760w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/8_hu_d552bf42f3b14fd6.webp 1200w"
src="https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/8_hu_fab52cf0c8e79e8d.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/06/visits-to-thai-universities-and-organisations/9_hu_aa2c4fa25983c5ca.webp 400w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/9_hu_13c7739afd642654.webp 760w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/9_hu_76c659add80d3b33.webp 1200w"
src="https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/9_hu_aa2c4fa25983c5ca.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/06/visits-to-thai-universities-and-organisations/10_hu_24824f401a5eb8f4.webp 400w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/10_hu_cd32a6448885e039.webp 760w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/10_hu_d805c702dcc9a07c.webp 1200w"
src="https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/10_hu_24824f401a5eb8f4.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/08/06/visits-to-thai-universities-and-organisations/11_hu_dc0cdf8a14bd6498.webp 400w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/11_hu_251dffdcbf86cbbf.webp 760w,
/post/2023/08/06/visits-to-thai-universities-and-organisations/11_hu_5cadbda8309b4b83.webp 1200w"
src="https://ual.sg/post/2023/08/06/visits-to-thai-universities-and-organisations/11_hu_dc0cdf8a14bd6498.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper and open-source software: Urbanity - automated modelling and analysis of multidimensional networks in cities</title><link>https://ual.sg/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/</link><pubDate>Mon, 31 Jul 2023 18:25:16 +0800</pubDate><guid>https://ual.sg/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Yap W, Stouffs R, Biljecki F (2023): Urbanity: automated modelling and analysis of multidimensional networks in cities. npj Urban Sustainability 3: 45. &lt;a href="https://doi.org/10.1038/s42949-023-00125-w" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1038/s42949-023-00125-w&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-npjus-urbanity/2023-npjus-urbanity.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt;.
Congratulations on the great work on both the software and publication! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;Urbanity is a network-based Python package developed by &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt; at our NUS Urban Analytics Lab to automate the construction of feature-rich (contextual and semantic) urban networks at any geographical scale. Through an accessible and simple-to-use interface, users can request heterogeneous urban information such as street view imagery, building morphology, population (including sub-groups), and points of interest for target areas of interest.&lt;/p&gt;
&lt;p&gt;Urbanity is designed with an object-oriented approach that parallels the urban planning process. The urban data science pipeline starts with a base map that users can use to explore their site. Subsequently, there are two ways to specify the geographical area of interest: 1) drawing it with the polygon and box tools provided; or 2) providing your own polygon boundary files (all common formats, such as .shp and .geojson, are supported).&lt;/p&gt;
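As an illustration of the second option, a GeoJSON polygon boundary is just a ring of coordinates that can be parsed and summarised before being handed to an analysis tool. The following is a minimal sketch using only the Python standard library; it is not Urbanity's actual API, and the sample coordinates are an arbitrary box over Singapore:

```python
import json

# A hypothetical GeoJSON Feature describing an area of interest.
geojson = '''{"type": "Feature", "geometry": {"type": "Polygon",
  "coordinates": [[[103.8, 1.28], [103.9, 1.28], [103.9, 1.35],
                   [103.8, 1.35], [103.8, 1.28]]]}}'''

def polygon_bounds(feature_str):
    """Return (min_lon, min_lat, max_lon, max_lat) of a GeoJSON polygon."""
    feature = json.loads(feature_str)
    ring = feature["geometry"]["coordinates"][0]  # exterior ring
    lons = [pt[0] for pt in ring]
    lats = [pt[1] for pt in ring]
    return (min(lons), min(lats), max(lons), max(lats))

print(polygon_bounds(geojson))  # (103.8, 1.28, 103.9, 1.35)
```

Tools that accept user-supplied boundaries typically derive such a bounding box first to clip data requests to the area of interest.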
&lt;p&gt;To explore the complexities underlying urban systems and enable comparative studies between cities, Urbanity is built to support downstream descriptive, modelling, and predictive urban analytical tasks.&lt;/p&gt;
&lt;p&gt;The features of Urbanity are as follows:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Rapid city-scale network generation&lt;/li&gt;
&lt;li&gt;Seamless computation of metric, topological, contextual, and semantic network indicators&lt;/li&gt;
&lt;li&gt;Node and edge spatial context computation&lt;/li&gt;
&lt;li&gt;Areal statistics for arbitrary urban subzones&lt;/li&gt;
&lt;li&gt;Validity checks for OpenStreetMap attribute completeness (number of buildings, percentage with height, percentage with levels, etc.)&lt;/li&gt;
&lt;li&gt;Primal planar, dual, and spatial graph generation&lt;/li&gt;
&lt;/ul&gt;
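The completeness checks in the list above amount to simple attribute tallies over building records. A minimal sketch of the idea, using hypothetical OSM-style records rather than Urbanity's internal data model:

```python
# Hypothetical OSM-style building records (None = attribute not tagged).
buildings = [
    {"id": 1, "height": 12.0, "levels": 4},
    {"id": 2, "height": None, "levels": 2},
    {"id": 3, "height": 25.5, "levels": None},
    {"id": 4, "height": None, "levels": None},
]

def attribute_completeness(records, key):
    """Percentage of records where the given attribute is present."""
    tagged = sum(1 for r in records if r.get(key) is not None)
    return 100.0 * tagged / len(records)

print(attribute_completeness(buildings, "height"))  # 50.0
print(attribute_completeness(buildings, "levels"))  # 50.0
```

Such percentages give a quick sense of how trustworthy height- or level-derived indicators will be for a given area.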
&lt;p&gt;The code can be found in &lt;a href="https://github.com/winstonyym/urbanity" target="_blank" rel="noopener"&gt;Urbanity&amp;rsquo;s GitHub repository&lt;/a&gt;.
The paper is available below.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/1_hu_4593101199520ecd.webp 400w,
/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/1_hu_3ee775a31c3d6be2.webp 760w,
/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/1_hu_62f37b70aa9ec748.webp 1200w"
src="https://ual.sg/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/1_hu_4593101199520ecd.webp"
width="760"
height="386"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/2_hu_da9ef6d81ec5d1ad.webp 400w,
/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/2_hu_caaa54a34850d545.webp 760w,
/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/2_hu_a8c72b2a2f898a13.webp 1200w"
src="https://ual.sg/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/2_hu_da9ef6d81ec5d1ad.webp"
width="760"
height="345"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Urban networks play a vital role in connecting multiple urban components and developing our understanding of cities and urban systems. Despite the significant progress we have made in understanding how city networks are connected and spread out, we still have a lot to learn about the meaning and context of these networks. The increasing availability of open data offers opportunities to supplement urban networks with specific location information and create more expressive urban machine-learning models. In this work, we introduce Urbanity, a network-based Python package to automate the construction of feature-rich urban networks anywhere and at any geographical scale. We discuss data sources, the features of our software, and a set of data representing the networks of five major cities around the world. We also test the usefulness of added context in our networks by classifying different types of connections within a single network. Our findings extend accumulated knowledge about how spaces and flows within city networks work, and affirm the importance of contextual features for analyzing city networks.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-npjus-urbanity/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-npjus-urbanity/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/page-one_hu_9180925051fa09.webp 400w,
/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/page-one_hu_dbd8e24f081991d4.webp 760w,
/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/page-one_hu_4776a7277886d944.webp 1200w"
src="https://ual.sg/post/2023/07/31/new-paper-and-open-source-software-urbanity-automated-modelling-and-analysis-of-multidimensional-networks-in-cities/page-one_hu_9180925051fa09.webp"
width="555"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_npjus_urbanity&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Yap, Winston and Stouffs, Rudi and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1038/s42949-023-00125-w}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{npj Urban Sustainability}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Urbanity: automated modelling and analysis of multidimensional networks in cities}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{3}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{45}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Visit by UDDC from Chulalongkorn University</title><link>https://ual.sg/post/2023/07/28/visit-by-uddc-from-chulalongkorn-university/</link><pubDate>Fri, 28 Jul 2023 17:39:19 +0800</pubDate><guid>https://ual.sg/post/2023/07/28/visit-by-uddc-from-chulalongkorn-university/</guid><description>&lt;p&gt;We were thrilled to welcome a group of exceptional young researchers led by Professor Niramon Serisakul from Chulalongkorn University, Thailand 🇹🇭, to the NUS Urban Analytics Lab.&lt;/p&gt;
&lt;p&gt;Their visit to our research group and the state-of-the-art SDE4 facility at NUS was truly an honour and a testament to the growing importance of urban analytics and city science in Southeast Asia.&lt;/p&gt;
&lt;p&gt;We eagerly look forward to exciting joint research endeavours with these talented designers and to applying cutting-edge data-driven solutions to urban challenges in Southeast Asia and beyond.
Check out the amazing work of the Urban Design and Development Center at their &lt;a href="https://www.uddc.net/" target="_blank" rel="noopener"&gt;website&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Visits to Takenaka Corporation and Japanese universities</title><link>https://ual.sg/post/2023/07/22/visits-to-takenaka-corporation-and-japanese-universities/</link><pubDate>Sat, 22 Jul 2023 11:11:49 +0800</pubDate><guid>https://ual.sg/post/2023/07/22/visits-to-takenaka-corporation-and-japanese-universities/</guid><description>&lt;p&gt;The PI of the Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, represented the Lab during a recent research visit to Japan &amp;#x1f1ef;&amp;#x1f1f5;.
He was invited to meetings at the R&amp;amp;D Institute of &lt;a href="https://www.takenaka.co.jp/takenaka_e/" target="_blank" rel="noopener"&gt;Takenaka Corporation&lt;/a&gt;.
We have recently started &lt;a href="https://ual.sg/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/"&gt;collaborating&lt;/a&gt; with this prominent Japanese construction company and are excited to expand it.&lt;/p&gt;
&lt;p&gt;While in Japan, Filip also visited the following universities:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Kyoto University&lt;/li&gt;
&lt;li&gt;Kyoto Institute of Technology&lt;/li&gt;
&lt;li&gt;University of Tokyo&lt;/li&gt;
&lt;li&gt;Ritsumeikan University&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;for collaborative exchanges and guest lectures at departments and research groups in our domain.&lt;/p&gt;
&lt;p&gt;Thank you for the insightful and productive discussions and for sharing your inspiring work.
The hospitality is very much appreciated, and we look forward to collaborating.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/22/visits-to-takenaka-corporation-and-japanese-universities/1_hu_a6a64746d970fb9f.webp 400w,
/post/2023/07/22/visits-to-takenaka-corporation-and-japanese-universities/1_hu_a00df67ff9572a80.webp 760w,
/post/2023/07/22/visits-to-takenaka-corporation-and-japanese-universities/1_hu_1c76bd8cfeecad27.webp 1200w"
src="https://ual.sg/post/2023/07/22/visits-to-takenaka-corporation-and-japanese-universities/1_hu_a6a64746d970fb9f.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/22/visits-to-takenaka-corporation-and-japanese-universities/2_hu_cdbe6a925006eb2b.webp 400w,
/post/2023/07/22/visits-to-takenaka-corporation-and-japanese-universities/2_hu_ebdf572b0dd7f24d.webp 760w,
/post/2023/07/22/visits-to-takenaka-corporation-and-japanese-universities/2_hu_70f575325d8227a7.webp 1200w"
src="https://ual.sg/post/2023/07/22/visits-to-takenaka-corporation-and-japanese-universities/2_hu_cdbe6a925006eb2b.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Visit by Prof Carlo Ratti from MIT</title><link>https://ual.sg/post/2023/07/18/visit-by-prof-carlo-ratti-from-mit/</link><pubDate>Tue, 18 Jul 2023 13:39:19 +0800</pubDate><guid>https://ual.sg/post/2023/07/18/visit-by-prof-carlo-ratti-from-mit/</guid><description>&lt;p&gt;It was an honour to have Professor Carlo Ratti from the &lt;a href="https://senseable.mit.edu" target="_blank" rel="noopener"&gt;MIT Senseable City Lab&lt;/a&gt; visit our research group at the net-zero energy building SDE4 of our College of Design and Engineering.
We look forward to collaborations with his esteemed research group at the Massachusetts Institute of Technology.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/18/visit-by-prof-carlo-ratti-from-mit/1_hu_15a3c0290c150de1.webp 400w,
/post/2023/07/18/visit-by-prof-carlo-ratti-from-mit/1_hu_d22dc35735349be4.webp 760w,
/post/2023/07/18/visit-by-prof-carlo-ratti-from-mit/1_hu_88cff6a6b0404b20.webp 1200w"
src="https://ual.sg/post/2023/07/18/visit-by-prof-carlo-ratti-from-mit/1_hu_15a3c0290c150de1.webp"
width="760"
height="528"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Visit by Dr Zhang Fan from the Hong Kong University of Science and Technology</title><link>https://ual.sg/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/</link><pubDate>Thu, 13 Jul 2023 13:39:19 +0800</pubDate><guid>https://ual.sg/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/</guid><description>&lt;p&gt;Our Lab and department hosted Dr &lt;a href="https://www.ce.ust.hk/people/fan-zhang-zhangfan" target="_blank" rel="noopener"&gt;Zhang Fan&lt;/a&gt; from the &lt;a href="https://www.ce.ust.hk" target="_blank" rel="noopener"&gt;Department of Civil and Environmental Engineering&lt;/a&gt; at the &lt;a href="https://hkust.edu.hk" target="_blank" rel="noopener"&gt;Hong Kong University of Science and Technology&lt;/a&gt;. 🇭🇰&lt;/p&gt;
&lt;p&gt;Fan previously served as a senior research fellow at MIT, where he led the Urban Visual AI group at the MIT Senseable City Lab.
His research sits at the intersection of urban informatics, data-driven approaches for urban studies, and geographic artificial intelligence.
He has published papers in journals such as PNAS, Nature Communications, Nature Reviews Earth &amp;amp; Environment, and was included in Stanford&amp;rsquo;s list of the world&amp;rsquo;s top 2% scientists in 2022.
Dr Zhang is currently an associate editor of Transactions in Urban Data, Science, and Technology, and a guest editor of ISPRS Journal of Photogrammetry and Remote Sensing.
He has served as a reviewer for over 50 SCI journals in GIS and urban studies.
He received the Global Young Scientist Award in Frontier Science and Technology from WGDC in 2020 and the Geospatial World 50 Rising Stars Award in 2022.&lt;/p&gt;
&lt;p&gt;During his stay, Fan delivered the lecture &lt;em&gt;Sensing Cities with Street-Level Imagery&lt;/em&gt; (poster and abstract below).&lt;/p&gt;
&lt;p&gt;Thanks, and looking forward to future collaborations!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/1_hu_b035cce7441ed6f4.webp 400w,
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/1_hu_7a1c064a18429d14.webp 760w,
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/1_hu_bdaf1d2431ea8806.webp 1200w"
src="https://ual.sg/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/1_hu_b035cce7441ed6f4.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/2_hu_99b4f0f318d824ee.webp 400w,
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/2_hu_4dec76a911cdfa3a.webp 760w,
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/2_hu_afcf96c3e2a84b2c.webp 1200w"
src="https://ual.sg/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/2_hu_99b4f0f318d824ee.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/3_hu_5728f9844c1793e1.webp 400w,
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/3_hu_17a6e9941c489b8b.webp 760w,
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/3_hu_b5000a9e050cb535.webp 1200w"
src="https://ual.sg/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/3_hu_5728f9844c1793e1.webp"
width="760"
height="571"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/poster_hu_9a38c4521dbf8e85.webp 400w,
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/poster_hu_134c28515e88d656.webp 760w,
/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/poster_hu_c554e265d7137c7e.webp 1200w"
src="https://ual.sg/post/2023/07/13/visit-by-dr-zhang-fan-from-the-hong-kong-university-of-science-and-technology/poster_hu_9a38c4521dbf8e85.webp"
width="537"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract-of-the-lecture"&gt;Abstract of the lecture&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;The integration of digital technologies into physical space has led to significant changes in how we understand, perceive, design, and experience cities. The advent of geospatial big data and advancements in AI have created new possibilities for sensing urban dynamics and evaluating the effects of urbanization. One type of geospatial data that has attracted attention recently is street-level imagery, which enables the urban physical environment to be observed from a human perspective. Recent developments in AI technologies have provided strong support for extracting semantic information from street-level imagery and quantifying the urban physical environment. This not only allows the urban physical environment to be observed from the human perspective, but also reveals human activities and socio-economic environments, providing new perspectives for research on human-land relationships, spatial data mining, and knowledge discovery. This presentation will introduce case studies of street-level imagery under the framework of urban visual intelligence.&lt;/p&gt;
&lt;/blockquote&gt;</description></item><item><title>Keynote at CUSP London on 3D Urban Models: Applications and Digital Twins</title><link>https://ual.sg/post/2023/07/05/keynote-at-cusp-london-on-3d-urban-models-applications-and-digital-twins/</link><pubDate>Wed, 05 Jul 2023 07:20:49 +0800</pubDate><guid>https://ual.sg/post/2023/07/05/keynote-at-cusp-london-on-3d-urban-models-applications-and-digital-twins/</guid><description>&lt;p&gt;&lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt; and &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; gave a keynote presentation at &lt;a href="https://www.eventbrite.co.uk/e/3d-urban-models-applications-and-digital-twins-tickets-657449387817" target="_blank" rel="noopener"&gt;3D Urban Models: Applications and Digital Twins&lt;/a&gt;, an online event that was part of the &lt;a href="https://www.londondataweek.org" target="_blank" rel="noopener"&gt;London Data Week&lt;/a&gt;, focusing on a wide range of 3D City model related topics: new visualization methods and future trends in VR, AR &amp;amp; Digital Twins.&lt;/p&gt;
&lt;p&gt;The event was organised by the &lt;a href="https://www.kcl.ac.uk/research/cusp" target="_blank" rel="noopener"&gt;Centre for Urban Science and Progress (CUSP)&lt;/a&gt;, King&amp;rsquo;s College London.&lt;/p&gt;
&lt;p&gt;We presented our research on urban data modelling and digital twins.
For the latest publications on this topic, check out &lt;a href="https://ual.sg/publication"&gt;our papers&lt;/a&gt;.
Also, make sure to check Binyu&amp;rsquo;s &lt;a href="https://ual.sg/post/2023/07/02/our-phd-researcher-binyu-leis-work-featured-in-gim-international-as-a-cover-story/"&gt;latest paper&lt;/a&gt; on uncovering challenges to the adoption and management of digital twins in the urban realm.&lt;/p&gt;
&lt;p&gt;We appreciate the invitation and thank Professor &lt;a href="https://www.kcl.ac.uk/people/nicolas-s.-holliman" target="_blank" rel="noopener"&gt;Nick Holliman&lt;/a&gt; and his colleagues for the organisation.&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/05/keynote-at-cusp-london-on-3d-urban-models-applications-and-digital-twins/1_hu_aa24594630777249.webp 400w,
/post/2023/07/05/keynote-at-cusp-london-on-3d-urban-models-applications-and-digital-twins/1_hu_4510f66b6fefbd55.webp 760w,
/post/2023/07/05/keynote-at-cusp-london-on-3d-urban-models-applications-and-digital-twins/1_hu_507373fb280899c.webp 1200w"
src="https://ual.sg/post/2023/07/05/keynote-at-cusp-london-on-3d-urban-models-applications-and-digital-twins/1_hu_aa24594630777249.webp"
width="428"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;</description></item><item><title>Our University is now ranked top 10 worldwide</title><link>https://ual.sg/post/2023/07/03/our-university-is-now-ranked-top-10-worldwide/</link><pubDate>Mon, 03 Jul 2023 14:08:19 +0800</pubDate><guid>https://ual.sg/post/2023/07/03/our-university-is-now-ranked-top-10-worldwide/</guid><description>&lt;p&gt;According to the results of the latest Quacquarelli Symonds (QS) World University Rankings (WUR) 2024, NUS now ranks eighth in the world and first in Asia.&lt;/p&gt;
&lt;p&gt;It is the first time that an Asian university has reached the top 10 globally, joining the likes of MIT, Stanford, Harvard, Cambridge, Oxford, ETH Zurich, Berkeley&amp;hellip;&lt;/p&gt;
&lt;p&gt;We include below the &lt;a href="https://news.nus.edu.sg/qs-world-university-rankings-2024-nus-rises-three-places-to-rank-within-global-top-8" target="_blank" rel="noopener"&gt;press release&lt;/a&gt; by the University.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;In a historic first for the University, NUS has broken into the worldwide top 10, according to results from the latest Quacquarelli Symonds (QS) World University Rankings (WUR) released on 28 June 2023. Now ranking 8th in the world, NUS is also the top university in Asia.&lt;/p&gt;
&lt;p&gt;The University leapt three spots in this year’s rankings after holding steady at 11th for the past five years. This puts NUS within the top 1 per cent of all universities evaluated this year, ranking among many highly prestigious institutions worldwide.&lt;/p&gt;
&lt;p&gt;Launched in 2003 and now in its 20th edition, the QS WUR is a portfolio of comparative university rankings released annually. The 2024 edition features 2,963 evaluated institutions across 104 locations, and is based on a methodology that considers a range of factors, including academic reputation, employer reputation, research impact, global engagement, and sustainability.&lt;/p&gt;
&lt;p&gt;“We are delighted with NUS’ excellent performance, ranking 8th in the world and top in Asia according to the latest QS World University Rankings, and the recognition accorded as one of the leading universities in the world, and in Asia. This is a historic first for NUS to be placed within the top ten globally, amongst many other prestigious institutions worldwide,” said NUS President Professor Tan Eng Chye.&lt;/p&gt;
&lt;p&gt;“This is a testament to our capabilities and commitment to providing a world-class, and interdisciplinary education to nurture agile and resilient graduates with diverse skills and knowledge for an ever-changing world. This achievement was made possible by the excellent contributions and outstanding work of our talented faculty, staff and students who remain deeply committed to flying the flag of our distinguished quality of education and creating positive impact in the classroom and beyond.”&lt;/p&gt;
&lt;p&gt;&amp;ldquo;We are thrilled to acknowledge the historic achievement of the National University of Singapore in joining the top 10 of the 20th edition of the QS World University Rankings,&amp;rdquo; shared Ms Jessica Turner, QS Chief Executive. &amp;ldquo;This unprecedented milestone marks a remarkable moment for Asian higher education and showcases NUS&amp;rsquo; dedication to research excellence, innovation, and sustainability.&amp;rdquo;&lt;/p&gt;
&lt;p&gt;She further added, &amp;ldquo;NUS&amp;rsquo; remarkable ascent not only highlights its outstanding academic performance but also underscores its commitment to producing graduates sought after by employers, as demonstrated by its impressive 7th place globally in the new Employment Outcomes indicator. The university&amp;rsquo;s distinguished faculty and researchers play a pivotal role in advancing knowledge and making significant contributions across diverse fields.&amp;rdquo;&lt;/p&gt;
&lt;p&gt;The full results of the QS World University Rankings 2024 are available at: &lt;a href="https://www.topuniversities.com" target="_blank" rel="noopener"&gt;https://www.topuniversities.com&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;NUS ranks within top universities in Asia&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;NUS has also been consistently ranked among the best universities in Asia. The University came in third place for the fourth consecutive year in the latest ranking of Asian universities released by Times Higher Education (THE) on 22 June 2023.&lt;/p&gt;
&lt;p&gt;China’s Tsinghua University topped the list, with Peking University placed second in the 11th edition of THE’s Asia University Rankings.&lt;/p&gt;
&lt;p&gt;Comprising 669 universities from 31 territories, the 2023 ranking assessed the universities on their core missions – teaching, research, knowledge transfer and international outlook. The THE Asia University Rankings is based on the same 13 performance indicators used in THE’s World University Rankings, which are recalibrated to reflect the attributes of Asia’s institutions. NUS was ranked 19th in the 2023 THE World University Rankings.&lt;/p&gt;</description></item><item><title>Our PhD researcher Binyu Lei's work featured in GIM International as a cover story</title><link>https://ual.sg/post/2023/07/02/our-phd-researcher-binyu-leis-work-featured-in-gim-international-as-a-cover-story/</link><pubDate>Sun, 02 Jul 2023 18:08:19 +0800</pubDate><guid>https://ual.sg/post/2023/07/02/our-phd-researcher-binyu-leis-work-featured-in-gim-international-as-a-cover-story/</guid><description>&lt;p&gt;&lt;a href="https://www.gim-international.com/" target="_blank" rel="noopener"&gt;GIM International&lt;/a&gt;, a leading magazine for geospatial professionals, has published &lt;a href="https://www.gim-international.com/content/article/uncovering-the-challenges-of-urban-digital-twins" target="_blank" rel="noopener"&gt;an article&lt;/a&gt; that features the PhD research of &lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt; on the adoption and management of urban digital twins.
It is published as a cover story.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.gim-international.com/content/article/uncovering-the-challenges-of-urban-digital-twins" target="_blank" rel="noopener"&gt;The article&lt;/a&gt; summarises a recent journal paper published in &lt;em&gt;Automation in Construction&lt;/em&gt;, available &lt;a href="https://ual.sg/publication/2023-autcon-dt-challenges/"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The work was co-authored with Patrick Janssen, Jantien Stoter (&lt;a href="https://3d.bk.tudelft.nl/" target="_blank" rel="noopener"&gt;TU Delft&lt;/a&gt;), and &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;You can check it out also below:&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/02/our-phd-researcher-binyu-leis-work-featured-in-gim-international-as-a-cover-story/1_hu_eb70a757e15bb926.webp 400w,
/post/2023/07/02/our-phd-researcher-binyu-leis-work-featured-in-gim-international-as-a-cover-story/1_hu_416eaf752a083b12.webp 760w,
/post/2023/07/02/our-phd-researcher-binyu-leis-work-featured-in-gim-international-as-a-cover-story/1_hu_e67129fad1085bbd.webp 1200w"
src="https://ual.sg/post/2023/07/02/our-phd-researcher-binyu-leis-work-featured-in-gim-international-as-a-cover-story/1_hu_eb70a757e15bb926.webp"
width="760"
height="540"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/07/02/our-phd-researcher-binyu-leis-work-featured-in-gim-international-as-a-cover-story/2_hu_d15d4376a5eb6266.webp 400w,
/post/2023/07/02/our-phd-researcher-binyu-leis-work-featured-in-gim-international-as-a-cover-story/2_hu_ab4dbfaff1ccd7c1.webp 760w,
/post/2023/07/02/our-phd-researcher-binyu-leis-work-featured-in-gim-international-as-a-cover-story/2_hu_98a9b43533d7584d.webp 1200w"
src="https://ual.sg/post/2023/07/02/our-phd-researcher-binyu-leis-work-featured-in-gim-international-as-a-cover-story/2_hu_d15d4376a5eb6266.webp"
width="760"
height="538"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Or as a &lt;a href="https://ual.sg/publication/2023-gim-dt-challenges/2023-gim-dt-challenges.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;If you want to cite this research in a scientific context, please refer to the originating journal paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Lei B, Janssen P, Stoter J, Biljecki F (2023): Challenges of Urban Digital Twins: A Systematic Review and a Delphi Expert Survey. &lt;em&gt;Automation in Construction&lt;/em&gt; 147: 104716. &lt;a href="https://doi.org/10.1016/j.autcon.2022.104716" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.autcon.2022.104716&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-autcon-dt-challenges/2023-autcon-dt-challenges.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_autcon_dt_challenges&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Challenges of Urban Digital Twins: A Systematic Review and a Delphi Expert Survey}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Lei, Binyu and Janssen, Patrick and Stoter, Jantien and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Automation in Construction}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.autcon.2022.104716}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104716}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{147}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Visits to Switzerland and Austria</title><link>https://ual.sg/post/2023/06/30/visits-to-switzerland-and-austria/</link><pubDate>Fri, 30 Jun 2023 08:24:49 +0800</pubDate><guid>https://ual.sg/post/2023/06/30/visits-to-switzerland-and-austria/</guid><description>&lt;p&gt;The PI of the Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, represented the Lab during a recent research visit at the Swiss Federal Institute of Technology (ETH) Zurich related to the project &lt;a href="https://fcl.ethz.ch/research/integration-and-strategies/semantic-urban-elements.html" target="_blank" rel="noopener"&gt;Semantic Urban Elements&lt;/a&gt; run at the &lt;a href="https://fcl.ethz.ch" target="_blank" rel="noopener"&gt;FCL Global&lt;/a&gt;, &lt;a href="https://sec.ethz.ch" target="_blank" rel="noopener"&gt;Singapore-ETH Centre&lt;/a&gt;.
He was kindly hosted by the &lt;a href="https://coss.ethz.ch/" target="_blank" rel="noopener"&gt;Chair of Computational Social Science&lt;/a&gt;, our partner group in the project.&lt;/p&gt;
&lt;p&gt;In addition, he gave guest lectures and participated in a variety of activities to further our network at the following institutes and groups: ETH Zurich &amp;ndash; &lt;a href="https://ikg.ethz.ch/en/" target="_blank" rel="noopener"&gt;Institute of Cartography and Geoinformation&lt;/a&gt; (kindly organised by the &lt;a href="https://gis.ethz.ch/en/" target="_blank" rel="noopener"&gt;Chair of Geoinformation Engineering&lt;/a&gt;), University of Zurich &amp;ndash; Department of Geography &amp;ndash; &lt;a href="https://www.geo.uzh.ch/en/units.html" target="_blank" rel="noopener"&gt;GIScience Center&lt;/a&gt; (kindly organised by the &lt;a href="https://www.geo.uzh.ch/en/units/giva.html" target="_blank" rel="noopener"&gt;Geographic Information Visualization and Analysis (GIVA)&lt;/a&gt;), and the University of Vienna (meeting the group of Prof &lt;a href="https://www.univie.ac.at/en/research/research-overview/neue-professuren-ab-2020/detailansicht-en/artikel/univ-prof-dr-krzysztof-janowicz/" target="_blank" rel="noopener"&gt;Krzysztof Janowicz&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;Thank you for the insightful and productive discussions, and for sharing the inspiring work.
The hospitality is very much appreciated, and we look forward to collaborating.&lt;/p&gt;</description></item><item><title>New paper: Sensitivity of measuring the urban form and greenery using street-level imagery</title><link>https://ual.sg/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/</link><pubDate>Sun, 18 Jun 2023 00:40:16 +0800</pubDate><guid>https://ual.sg/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Zhao T, Liang X, Hou Y (2023): Sensitivity of measuring the urban form and greenery using street-level imagery: A comparative study of approaches and visual perspectives. &lt;em&gt;International Journal of Applied Earth Observation and Geoinformation&lt;/em&gt; 122: 103385. &lt;a href="https://doi.org/10.1016/j.jag.2023.103385" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.jag.2023.103385&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-jag-svi-sensitivity/2023-jag-svi-sensitivity.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, and contributed by &lt;a href="https://ual.sg/author/tianhong-zhao/"&gt;Tianhong Zhao&lt;/a&gt;, &lt;a href="https://ual.sg/author/xiucheng-liang/"&gt;Xiucheng Liang&lt;/a&gt;, and &lt;a href="https://ual.sg/author/yujun-hou/"&gt;Yujun Hou&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The paper clarifies two matters related to street view imagery (SVI), which are arguably relevant but have not been examined yet:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Understanding the different ways that SVI is used to measure the urban form: green view index, sky view factor&amp;hellip; These metrics are omnipresent in hundreds of papers, but can actually be computed in different ways, and the approaches have not been compared.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Understanding whether crowdsourced SVI (e.g. Mapillary &amp;amp; KartaView) can replace data from far more popular commercial sources such as Google Street View and Baidu Maps.
This is a broad topic because these data provenances differ in many ways &amp;ndash; completeness, image quality, etc. (we did publish &lt;a href="https://ual.sg/publication/2022-jag-svi-quality/"&gt;a paper&lt;/a&gt; on that, led by &lt;a href="https://ual.sg/author/yujun-hou/"&gt;Yujun Hou&lt;/a&gt;).
This new paper tackles one aspect: whether single images (like those collected from smartphones and dashcams) can be employed for the same use cases as panoramic images are routinely used for.
Most notably, the crowdsourced images usually have a limited field of view, as opposed to a panoramic view of the streetscape, and we looked into whether that may be an obstacle.
We suspected it might be, because &lt;a href="https://ual.sg/publication/2021-land-svi-review/"&gt;the review paper we published in 2021&lt;/a&gt; found that only 1-2% of studies use crowdsourced data. But is that really an obstacle?&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;To investigate both these aspects, we sourced 670k images from 5 cities around the world (this is another contribution since studies that use SVI rarely focus on multiple cities).
We simulated about 100 scenarios of different consumer cameras and capture settings, e.g. capturing imagery facing the right side of the driving direction, at a field of view of 140 degrees and an aspect ratio of 16:9.
We ended up having 70 million images to analyse, deriving 210 million metrics (since we focused on 3 metrics: greenery, sky, and buildings).&lt;/p&gt;
&lt;p&gt;Long story short:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;We demonstrate that the different approaches yield results ranging from similar (R=0.82 for green view) to practically identical (R=0.98 for sky view). Therefore, different approaches of deriving the same metric are mostly equivalent.&lt;/li&gt;
&lt;li&gt;Single images with a particular setting can approximate panoramic results closely (e.g. for the scenario described above &amp;ndash; FOV=140, AR=16/9, D=0 &amp;ndash; we established that R is 0.92, which is quite high).
That means that (i) crowdsourced imagery may be more useful than many may think; and (ii) when using GSV data, maybe we don&amp;rsquo;t need the whole panorama, and we can simplify the process by using just one perspective image. This is important to our research group, since much of our research is increasingly shifting towards crowdsourced data, and some of us already use Mapillary and its limited images extensively (e.g. &lt;a href="https://ual.sg/publication/2023-epb-semantic-networks/"&gt;the recent work&lt;/a&gt; by &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt; published in EPB).&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;You are welcome to check out the comprehensive findings in &lt;a href="https://ual.sg/publication/2023-jag-svi-sensitivity/"&gt;the open access paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/1_hu_cab94557f3725e7e.webp 400w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/1_hu_fbd04e3037e5fac6.webp 760w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/1_hu_89e93e55890c5645.webp 1200w"
src="https://ual.sg/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/1_hu_cab94557f3725e7e.webp"
width="760"
height="685"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/2_hu_cadd9d0741bc5163.webp 400w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/2_hu_7945518e658b7703.webp 760w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/2_hu_510d18d7a6731924.webp 1200w"
src="https://ual.sg/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/2_hu_cadd9d0741bc5163.webp"
width="670"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/3_hu_5260c75a79c7a5f1.webp 400w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/3_hu_2219c8e4fb53ea69.webp 760w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/3_hu_ec8b393740dc6b1e.webp 1200w"
src="https://ual.sg/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/3_hu_5260c75a79c7a5f1.webp"
width="595"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/4_hu_914cfb463f20a1d9.webp 400w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/4_hu_5ca31766452f0fe7.webp 760w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/4_hu_c244aebfd152636f.webp 1200w"
src="https://ual.sg/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/4_hu_914cfb463f20a1d9.webp"
width="760"
height="307"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/5_hu_3a5f1bc4ff1f958f.webp 400w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/5_hu_cfbe737c6c8a4f66.webp 760w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/5_hu_588c4bebf3103fc5.webp 1200w"
src="https://ual.sg/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/5_hu_3a5f1bc4ff1f958f.webp"
width="748"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Street View Imagery (SVI) is crucial in estimating indicators such as Sky View Factor (SVF) and Green View Index (GVI), but (1) approaches and terminology differ across fields such as planning, transportation and climate, potentially causing inconsistencies; (2) it is unknown whether the regularly used panoramic imagery is actually essential for such tasks, or we can use only a portion of the imagery, simplifying the process; and (3) we do not know if non-panoramic (single-frame) photos typical in crowdsourced platforms can serve the same purposes as panoramic ones from services such as Google Street View and Baidu Maps for their limited perspectives. This study is the first to examine comprehensively the built form metrics, the influence of different practices on computing them across multiple fields, and the usability of normal photos (from consumer cameras). We overview approaches and run experiments on 70 million images in 5 cities to analyse the impact of a multitude of variants of SVI on characterising the physical environment and mapping street canyons: a few panoramic approaches (e.g. fisheye) and 96 scenarios of perspective imagery with variable directions, fields of view, and aspect ratios mirroring diverse photos from smartphones and dashcams. We demonstrate that (1) disparate panoramic approaches give different but mostly comparable results in computing the same metric (e.g. from R=0.82 for Green View to R=0.98 for Sky View metrics); and (2) often (e.g. when using a front-facing ultrawide camera), single-frame images can derive results comparable to commercial panoramic counterparts. This finding may simplify typical processes of using panoramic data and also unlock the value of billions of crowdsourced images, which are often overlooked, and can benefit scores of locations worldwide not yet covered by commercial services. Further, when aggregated for city-scale analyses, the results correspond closely.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Comprehensive examination of approaches to estimating the urban form.&lt;/li&gt;
&lt;li&gt;Different characteristics of data impact the measurements.&lt;/li&gt;
&lt;li&gt;Novel method studying uncertainty in a controlled, simulated and scalable environment.&lt;/li&gt;
&lt;li&gt;Multi-dimensional and multi-city experiments on buildings, greenery, and sky view.&lt;/li&gt;
&lt;li&gt;Reliability of single (crowdsourced) imagery is comparable to commercial panoramas.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-jag-svi-sensitivity/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-jag-svi-sensitivity/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/page-one_hu_46155118fb28aa23.webp 400w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/page-one_hu_ee67bb43712ee78b.webp 760w,
/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/page-one_hu_504bdec518c522e9.webp 1200w"
src="https://ual.sg/post/2023/06/18/new-paper-sensitivity-of-measuring-the-urban-form-and-greenery-using-street-level-imagery/page-one_hu_46155118fb28aa23.webp"
width="584"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_jag_svi_sensitivity&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Filip Biljecki and Tianhong Zhao and Xiucheng Liang and Yujun Hou}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.jag.2023.103385}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Applied Earth Observation and Geoinformation}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{103385}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Sensitivity of measuring the urban form and greenery using street-level imagery: A comparative study of approaches and visual perspectives}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{122}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Call for papers for a special issue (until June 2024)</title><link>https://ual.sg/post/2023/06/17/call-for-papers-for-a-special-issue-until-june-2024/</link><pubDate>Sat, 17 Jun 2023 23:40:16 +0800</pubDate><guid>https://ual.sg/post/2023/06/17/call-for-papers-for-a-special-issue-until-june-2024/</guid><description>&lt;p&gt;We announce a new special issue in the &lt;a href="https://www.sciencedirect.com/journal/international-journal-of-applied-earth-observation-and-geoinformation" target="_blank" rel="noopener"&gt;International Journal of Applied Earth Observation and Geoinformation&lt;/a&gt; on Sustainable geospatial analytics and geoinformatics with repeatable, reproducible, and expandable (RRE) framework and design.
We welcome submissions until June 2024.&lt;/p&gt;
&lt;p&gt;This is not the first time we have worked on a special issue.
Four years ago, we organised &lt;a href="https://ual.sg/post/2019/09/12/special-issue-in-transactions-in-gis-on-3d-city-modelling-and-bim/"&gt;one on 3D GIS and BIM in Transactions of GIS&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The guest editors of this special issue are Siqin (Sisi) Wang (University of Queensland, RMIT University, University of Tokyo), Xiao Huang (University of Arkansas), Filip Biljecki (NUS), Francisco Rowe (University of Liverpool), and Veruska Muccione (University of Zurich).&lt;/p&gt;
&lt;p&gt;The call for papers is available below and at &lt;a href="https://www.sciencedirect.com/journal/international-journal-of-applied-earth-observation-and-geoinformation/about/call-for-papers#sustainable-geospatial-analytics-and-geoinformatics-with-repeatable-reproducible-and-expandable-rre-framework-and-design" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The journal is an open access journal that has a relatively large &lt;a href="https://www.sciencedirect.com/journal/international-journal-of-applied-earth-observation-and-geoinformation/about/aims-and-scope" target="_blank" rel="noopener"&gt;scope&lt;/a&gt;, attracting a variety of high quality papers from many research groups around the world.
We also published a couple of papers in this journal (e.g. &lt;a href="https://ual.sg/publication/2023-jag-svi-sensitivity/"&gt;here&lt;/a&gt; and &lt;a href="https://ual.sg/publication/2022-jag-geoai/"&gt;here&lt;/a&gt;) in the past year, and we can recommend it.&lt;/p&gt;
&lt;p&gt;For the list of the ongoing special issues in the journal, click &lt;a href="https://www.sciencedirect.com/journal/international-journal-of-applied-earth-observation-and-geoinformation/special-issues" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id="call-for-papers--sustainable-geospatial-analytics-and-geoinformatics-with-repeatable-reproducible-and-expandable-rre-framework-and-design"&gt;Call for papers &amp;mdash; Sustainable geospatial analytics and geoinformatics with repeatable, reproducible, and expandable (RRE) framework and design&lt;/h2&gt;
&lt;p&gt;In recent decades, GIScience has undergone significant development, leading us toward an era that emphasizes sustainability, sharing, repeatability, and reproducibility. Sustainable spatiotemporal data analysis is a critical aspect of understanding social and environmental processes and predicting their long-term outcomes. To achieve this, frameworks and designs that are repeatable, reproducible, and easily re-employed by a wide range of end-users, policymakers, and individuals without technical expertise are required. A sustainable approach to spatiotemporal data analysis involves considering the persistence and sharing of data collection, management, methodologies and workflows. This requires proper training and practice in data collection, including minimizing data errors and biases, using open data standards, and appropriately documenting the data and methods. Additionally, a repeatable and reproducible framework for data analysis ensures that results can be validated and replicated, thus increasing confidence in the findings. In essence, sustainable spatiotemporal data analytics promotes the cutting-edge frontier of human-centered open science, thereby advancing the field and leading us toward a more sustainable and equitable future.&lt;/p&gt;
&lt;p&gt;As such, this special issue aims to promote the development of repeatable, reproducible, and expandable (RRE) frameworks, methodologies, and technologies (e.g., use of KNIME workflow) for spatial data analysis, spatial data sharing, and applied research in the fields of spatiotemporal innovation. It welcomes papers with the following topics:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Review articles on the development of RRE frameworks, methodologies, and technologies&lt;/li&gt;
&lt;li&gt;Technical development of new RRE frameworks and methodologies centered around spatiotemporal data&lt;/li&gt;
&lt;li&gt;Empirical studies using spatiotemporal data as well as RRE frameworks and methodologies in the cross-subdomain of geography and sustainability science, including but not limited to:&lt;/li&gt;
&lt;/ol&gt;
&lt;ul&gt;
&lt;li&gt;sustainable society&lt;/li&gt;
&lt;li&gt;urban sustainability&lt;/li&gt;
&lt;li&gt;environmental sustainability&lt;/li&gt;
&lt;li&gt;ecological sustainability&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;All submissions are required to include sharable data, code, frameworks, and methods, and are expected to be designed as an RRE workflow, e.g. using KNIME, Model Builder, or other workflow software.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Submission deadline: June 9, 2024&lt;/em&gt;&lt;/p&gt;
&lt;h3 id="keywords"&gt;Keywords:&lt;/h3&gt;
&lt;p&gt;Geospatial analytics, GIScience, Geoinformatics, open science, sharable workflow, repeatability, reproducibility, spatiotemporal innovation&lt;/p&gt;
&lt;h3 id="why-publish-in-this-special-issue"&gt;Why publish in this Special Issue?&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Special Issue articles are published together on ScienceDirect, making it incredibly easy for other researchers to discover your work.&lt;/li&gt;
&lt;li&gt;Special content articles are downloaded on ScienceDirect twice as often within the first 24 months as articles published in regular issues.&lt;/li&gt;
&lt;li&gt;Special content articles attract 20% more citations in the first 24 months than articles published in regular issues.&lt;/li&gt;
&lt;li&gt;All articles in this special issue will be reviewed by no fewer than two independent experts to ensure the quality, originality and novelty of the work published.&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>New paper: Insights in a city through the eyes of Airbnb reviews: Sensing urban characteristics from homestay guest experiences</title><link>https://ual.sg/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/</link><pubDate>Tue, 06 Jun 2023 18:25:16 +0200</pubDate><guid>https://ual.sg/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Wang J, Chow YS, Biljecki F (2023): Insights in a city through the eyes of Airbnb reviews: Sensing urban characteristics from homestay guest experiences. &lt;em&gt;Cities&lt;/em&gt; 140: 104399. &lt;a href="https://doi.org/10.1016/j.cities.2023.104399" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.cities.2023.104399&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-cities-airbnb/2023-cities-airbnb.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/jiaxuan-wang/"&gt;Jiaxuan Wang&lt;/a&gt;.
Congratulations on the great work and publication! &amp;#x1f64c; &amp;#x1f44f;
Jiaxuan has graduated from our NUS Master of Urban Planning programme.&lt;/p&gt;
&lt;p&gt;Until 2023-07-26, the article is available for free via &lt;a href="https://authors.elsevier.com/a/1hCbTy5jOr5h-" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/1_hu_b17fae07cc1e434c.webp 400w,
/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/1_hu_dccc9c34e9f83975.webp 760w,
/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/1_hu_8bf1206e07e0d914.webp 1200w"
src="https://ual.sg/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/1_hu_b17fae07cc1e434c.webp"
width="648"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/2_hu_77f306533660a99e.webp 400w,
/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/2_hu_dc5dcf1bdf429865.webp 760w,
/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/2_hu_3ced0bf5f8b02933.webp 1200w"
src="https://ual.sg/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/2_hu_77f306533660a99e.webp"
width="760"
height="317"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;There is a growing interest in deriving insights about cities from crowdsourced data. We advance the discourse by employing homestay guest experience to sense urban characteristics. We evaluate the relationship between subjective perceptions and objective indicators thanks to rich information in textual reviews that we posit reflect urban qualities. Next, we investigate dominant topics about urban characteristics in Airbnb reviews (transportation, greenery, amenities, safety, and noise) with natural language processing techniques, i.e. a rule-based dependency parsing method designed to extract relevant information. Then, we establish the associations between sentiments and proxies representing the physical patterns of urban areas. The multi-scale results of the experiments in three cities (London, Singapore, and NYC) suggest that reviews on homestay platforms reflect transportation convenience, amenities, sense of safety, and noise pollution. The correlation is stronger at a higher administrative division level, while the perception of people on safety is more sensitive at a more granular scale. Densities of transportation and amenities in nearby districts are more likely to be perceived similarly. Furthermore, the spatial distribution of perceptions is possibly affected by the morphology and development of a city, and the diversity of guests. This study reveals new possibilities for sensing urban characteristics through user-generated information and introduces a new application of accommodation reviews, which may help alleviate gaps in availability of data required for planning.&lt;/p&gt;
&lt;/blockquote&gt;
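The aspect extraction described in the abstract can be illustrated in miniature. The paper itself uses rule-based dependency parsing; the sketch below only approximates the idea with keyword lexicons (every aspect term, sentiment word, and the sample review are made up for illustration), scoring sentiment in a small window around each aspect mention.

```python
# Toy illustration of aspect-based extraction from guest reviews.
# The paper uses rule-based dependency parsing over Airbnb reviews; this
# sketch only approximates the idea with hypothetical keyword lexicons.
ASPECTS = {
    "transportation": {"metro", "bus", "station", "transport"},
    "greenery": {"park", "green", "trees", "garden"},
    "amenities": {"restaurants", "shops", "supermarket", "cafes"},
    "safety": {"safe", "unsafe", "dangerous", "secure"},
    "noise": {"noisy", "quiet", "loud", "noise"},
}
POSITIVE = {"great", "convenient", "lovely", "quiet", "safe", "close"}
NEGATIVE = {"bad", "noisy", "loud", "unsafe", "dangerous"}

def extract_aspects(review: str) -> dict:
    """Return {aspect: crude sentiment score} for aspects found in a review."""
    tokens = [t.strip(".,!?;:").lower() for t in review.split()]
    results = {}
    for aspect, lexicon in ASPECTS.items():
        hits = [i for i, t in enumerate(tokens) if t in lexicon]
        if not hits:
            continue
        score = 0
        for i in hits:
            window = tokens[max(0, i - 3):i + 4]  # words near the aspect term
            score += sum(w in POSITIVE for w in window)
            score -= sum(w in NEGATIVE for w in window)
        results[aspect] = score
    return results

print(extract_aspects(
    "The metro station was close and the park nearby was lovely, but nights were noisy."
))
```

A window-based score is a crude stand-in for the syntactic links that dependency parsing recovers, but it shows the shape of the output: per-review, per-aspect sentiment that can then be aggregated spatially.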
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Text reviews on homestay platforms reflect physical conditions of a city.&lt;/li&gt;
&lt;li&gt;114,340 Airbnb listings and 899,776 reviews were analysed using dependency parsing.&lt;/li&gt;
&lt;li&gt;Five dominant urban aspects are common in Airbnb reviews in London, NYC, and Singapore.&lt;/li&gt;
&lt;li&gt;The homestay reviews suggest quality of access to transportation and amenities.&lt;/li&gt;
&lt;li&gt;Human perceptions on urban characteristics vary on different administrative levels.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-cities-airbnb/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-cities-airbnb/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/page-one_hu_c74eb32bd8e0ce4a.webp 400w,
/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/page-one_hu_4cfa879be2a34ea.webp 760w,
/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/page-one_hu_c3855d10902e561a.webp 1200w"
src="https://ual.sg/post/2023/06/06/new-paper-insights-in-a-city-through-the-eyes-of-airbnb-reviews-sensing-urban-characteristics-from-homestay-guest-experiences/page-one_hu_c74eb32bd8e0ce4a.webp"
width="591"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_cities_airbnb&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Jiaxuan Wang and Yoong Shin Chow and Filip Biljecki}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.cities.2023.104399}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104399}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Insights in a city through the eyes of Airbnb reviews: Sensing urban characteristics from homestay guest experiences}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{140}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: How spatio-temporal resolution impacts urban energy calibration</title><link>https://ual.sg/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/</link><pubDate>Fri, 26 May 2023 08:22:16 +0800</pubDate><guid>https://ual.sg/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Dilsiz AD, Nweye KE, Wu AJ, Kämpf JH, Biljecki F, Nagy Z (2023): How spatio-temporal resolution impacts urban energy calibration. &lt;em&gt;Energy and Buildings&lt;/em&gt; 292: 113175. &lt;a href="https://doi.org/10.1016/j.enbuild.2023.113175" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.enbuild.2023.113175&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-enb-ubem-resolution/2023-enb-ubem-resolution.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The paper addresses the crucial role of Urban Energy Models (UBEMs) in evaluating the energy performance of buildings, ranging from individual buildings to entire districts.
It articulates the need for consistent reporting on accuracy in UBEM, and it introduces a multi-dimensional Level of Detail (LoD) specification for UBEM, including geometry, thermal zoning, and spatio-temporal resolution of the measured data used to calibrate the models.&lt;/p&gt;
&lt;p&gt;The project was spearheaded by &lt;a href="https://www.uwyo.edu/civil/faculty_staff/faculty/aysegul-demir/" target="_blank" rel="noopener"&gt;Aysegul Demir Dilsiz&lt;/a&gt; (who is now Asst Prof at the University of Wyoming) and &lt;a href="https://www.ie-lab.org/author/zoltan-nagy/" target="_blank" rel="noopener"&gt;Zoltan Nagy&lt;/a&gt; from the &lt;a href="https://www.ie-lab.org" target="_blank" rel="noopener"&gt;Intelligent Environments Laboratory&lt;/a&gt; at the University of Texas at Austin.&lt;/p&gt;
&lt;p&gt;Until 2023-07-14, the article is available for free via &lt;a href="https://authors.elsevier.com/c/1h8R9_8dCXk2Vr" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/1_hu_abc47f9c57dc4024.webp 400w,
/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/1_hu_2c6be9eae9a9162d.webp 760w,
/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/1_hu_4ea395898f77c89c.webp 1200w"
src="https://ual.sg/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/1_hu_abc47f9c57dc4024.webp"
width="490"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/2_hu_9380396223360982.webp 400w,
/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/2_hu_1a8da127af0039ee.webp 760w,
/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/2_hu_ad8c6e78a29a9801.webp 1200w"
src="https://ual.sg/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/2_hu_9380396223360982.webp"
width="760"
height="476"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Building Energy Modeling tools help forecast the energy performance of buildings. Urban energy models (UBEMs) emerged as important instruments to analyze the energy performance of buildings aggregated at different spatial resolutions, from the building level to the district level. They heavily rely on available data on geometries and measurements to create accurately calibrated energy models. However, limited research has been conducted to understand the impact of spatial and temporal resolution on the simulation results because of the difficulty of comparing results and not having a standardized procedure to report simulation errors. We review the literature on UBEM validation compared to measured energy data and show the discrepancies in the reporting accuracy. We articulate the need for consistent reporting on model accuracy and introduce a multi-dimensional Level of Detail (LoD) specification for UBEM, including geometry, thermal zoning, and spatio-temporal resolution of the measured data used to calibrate the models. Using a university campus with 70 buildings as an extensive case study, we demonstrate the performance of Bayesian calibration from the building level to the aggregated level. Our results suggest that the accuracy of urban energy prediction with annual temporal resolution can be significantly increased if calibration is performed by using building-level data. However, whenever privacy is a concern, then the data should be provided by aggregating them based on primary use type. Additionally, using monthly data to calibrate uncertain input parameters is not improving the accuracy of the models because the obtained posterior distributions for the selected parameters are not informative for monthly data. To improve this shortcoming, we suggest seasonal calibration, which is computationally costly.&lt;/p&gt;
&lt;/blockquote&gt;
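The Bayesian calibration step mentioned in the abstract can be shown at toy scale. The snippet below is a deliberately simplified sketch, not the paper's workflow: a single uncertain parameter of a made-up annual-energy surrogate is calibrated against one fabricated "measured" value via a grid posterior (all numbers are invented for illustration).

```python
import math

# Minimal grid-based Bayesian calibration sketch. The paper calibrates a
# full urban building energy model; here a one-parameter toy surrogate
# stands in for the simulator, and all values are hypothetical.

def simulate_annual_energy(infiltration: float) -> float:
    """Toy surrogate for an energy model: annual kWh/m2."""
    return 80.0 + 120.0 * infiltration

measured = 140.0  # fabricated "measured" annual energy, kWh/m2
sigma = 5.0       # assumed measurement noise (std dev)

grid = [i / 100 for i in range(0, 101)]   # candidate infiltration values
prior = [1.0 / len(grid)] * len(grid)     # flat prior over the grid

def likelihood(param: float) -> float:
    resid = measured - simulate_annual_energy(param)
    return math.exp(-0.5 * (resid / sigma) ** 2)

# Posterior ∝ prior × likelihood, normalised over the grid.
unnorm = [p * likelihood(g) for p, g in zip(prior, grid)]
z = sum(unnorm)
posterior = [u / z for u in unnorm]

best = grid[max(range(len(grid)), key=posterior.__getitem__)]
print(f"posterior mode: {best:.2f}")
```

With building-level measurements the likelihood is sharp and the posterior concentrates; aggregating or coarsening the data (the paper's monthly case) flattens the likelihood, which is why the posteriors there become uninformative.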
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-enb-ubem-resolution/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-enb-ubem-resolution/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/page-one_hu_192cb0eb27ddda43.webp 400w,
/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/page-one_hu_a9ee5616ed39048.webp 760w,
/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/page-one_hu_45fbb2bf7222e18e.webp 1200w"
src="https://ual.sg/post/2023/05/26/new-paper-how-spatio-temporal-resolution-impacts-urban-energy-calibration/page-one_hu_192cb0eb27ddda43.webp"
width="592"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_enb_ubem_resolution&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Aysegul Demir Dilsiz and Kingsley E. Nweye and Allen J. Wu and Jérôme H. Kämpf and Filip Biljecki and Zoltan Nagy}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.enbuild.2023.113175}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Energy and Buildings}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{113175}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{How spatio-temporal resolution impacts urban energy calibration}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{292}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Revealing spatio-temporal evolution of urban visual environments with street view imagery</title><link>https://ual.sg/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/</link><pubDate>Thu, 18 May 2023 19:45:16 +0800</pubDate><guid>https://ual.sg/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Liang X, Zhao T, Biljecki F (2023): Revealing spatio-temporal evolution of urban visual environments with street view imagery. &lt;em&gt;Landscape and Urban Planning&lt;/em&gt; 237: 104802. &lt;a href="https://doi.org/10.1016/j.landurbplan.2023.104802" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.landurbplan.2023.104802&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-landup-svi-evolution/2023-landup-svi-evolution.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/xiucheng-liang/"&gt;Xiucheng Liang&lt;/a&gt;.
Congratulations on the great work and publication! &amp;#x1f64c; &amp;#x1f44f;
Xiucheng has graduated from our NUS Master of Urban Planning programme.&lt;/p&gt;
&lt;p&gt;Until 2023-07-06, the article is available for free via &lt;a href="https://authors.elsevier.com/a/1h5iEcUG5SiP%7E" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/1_hu_207af5de451059ba.webp 400w,
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/1_hu_96e2d0be21cabf4.webp 760w,
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/1_hu_acd0d794168fc486.webp 1200w"
src="https://ual.sg/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/1_hu_207af5de451059ba.webp"
width="612"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/2_hu_48960c5530a534af.webp 400w,
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/2_hu_f5570509c4a11975.webp 760w,
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/2_hu_9178ea5f77206908.webp 1200w"
src="https://ual.sg/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/2_hu_48960c5530a534af.webp"
width="718"
height="519"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/3_hu_eaddfdea520563e7.webp 400w,
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/3_hu_feb999914d0b19cc.webp 760w,
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/3_hu_a0246f170f2408cb.webp 1200w"
src="https://ual.sg/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/3_hu_eaddfdea520563e7.webp"
width="760"
height="715"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/4_hu_f4f583637f7a71df.webp 400w,
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/4_hu_3a5f405735a69bf1.webp 760w,
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/4_hu_46a7541355d22caf.webp 1200w"
src="https://ual.sg/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/4_hu_f4f583637f7a71df.webp"
width="730"
height="432"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The visual landscape plays a pivotal role in urban planning and healthy cities. Recent studies of visual evaluation focus on either objective or subjective approach, while describing the visual character holistically and monitor its evolution remains challenging. This study introduces an embedding-driven clustering approach that integrates both physical and perceptual attributes to infer the spatial structure of the visual environment, and investigates its spatio-temporal evolution. Singapore, a highly urbanised yet green city, is selected as a case study. Firstly, a visual feature matrix is derived from street view imagery (SVI). Then, a graph neural network is constructed based on road connections to encode visual features and spatial dependency leading to a clustering algorithm that is used to discover the underlying characteristics of the visual environment. The implementation characterises streetscapes of the city-state into six types of clusters. Finally, taking advantage of historical SVI, a longitudinal analysis reveals how visual clusters have evolved in the past decade. Among them, one of the clusters represents high-density visual experience, affirming the work as such streetscape dominates the central business district and it is evolving elsewhere, mirroring the expansion of new towns. In turn, another identified cluster, indicating sparse landscapes, decreased, while areas that are considered to be in the most visually pleasant cluster, increased. For the first time, this study demonstrates a novel method to understand the urban visual structure and analyse its spatio-temporal evolution, which could support future planning decision-making and urban landscape betterment.&lt;/p&gt;
&lt;/blockquote&gt;
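The clustering idea in the abstract can be illustrated on a toy scale. The paper encodes visual features with a graph neural network that respects road connectivity; the sketch below skips that entirely and simply groups a few invented street-level feature vectors (fractions of greenery, sky, and buildings per image) with plain k-means and a deterministic initialisation, to show the final grouping-into-clusters step only.

```python
# Toy k-means over made-up street-level visual features: each tuple is
# (greenery, sky, building) pixel fractions for one street segment.
# The paper's method (GNN embedding + spatial dependency) is far richer;
# this only illustrates clustering streets by visual character.
points = [
    (0.10, 0.10, 0.80),  # dense built-up streetscape
    (0.60, 0.30, 0.10),  # leafy street
    (0.05, 0.80, 0.10),  # open, sparse landscape
    (0.15, 0.05, 0.75),
    (0.55, 0.35, 0.10),
    (0.10, 0.75, 0.05),
]

def kmeans(pts, k, iters=10):
    centroids = [list(p) for p in pts[:k]]  # deterministic init: first k points
    labels = [0] * len(pts)
    for _ in range(iters):
        # assign each point to its nearest centroid (squared Euclidean)
        for i, p in enumerate(pts):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # move each centroid to the mean of its assigned members
        for c in range(k):
            members = [p for i, p in enumerate(pts) if labels[i] == c]
            if members:
                centroids[c] = [sum(dim) / len(members) for dim in zip(*members)]
    return labels

labels = kmeans(points, 3)
print(labels)
```

Repeating such a grouping on imagery from different years is, in spirit, how cluster membership can be tracked over time, though the paper's longitudinal analysis rests on the learned graph embeddings rather than raw fractions.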
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;A neural network approach for unsupervised visual environment clustering.&lt;/li&gt;
&lt;li&gt;Integration of multivariate and spatial relationship in urban environment studies.&lt;/li&gt;
&lt;li&gt;Exploration of time series street view imagery.&lt;/li&gt;
&lt;li&gt;Spatio-temporal evolution of the visual environment was analysed.&lt;/li&gt;
&lt;li&gt;The method has established six clusters of the streetscape in Singapore.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-landup-svi-evolution/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-landup-svi-evolution/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/page-one_hu_8564ce8b85ef1cf0.webp 400w,
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/page-one_hu_7d4adbaaf68dbc29.webp 760w,
/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/page-one_hu_812721b60e1fb0b5.webp 1200w"
src="https://ual.sg/post/2023/05/18/new-paper-revealing-spatio-temporal-evolution-of-urban-visual-environments-with-street-view-imagery/page-one_hu_8564ce8b85ef1cf0.webp"
width="583"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_landup_svi_evolution&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Xiucheng Liang and Tianhong Zhao and Filip Biljecki}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.landurbplan.2023.104802}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Landscape and Urban Planning}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104802}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Revealing spatio-temporal evolution of urban visual environments with street view imagery}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{237}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Visits to Liverpool and Heidelberg</title><link>https://ual.sg/post/2023/04/28/visits-to-liverpool-and-heidelberg/</link><pubDate>Fri, 28 Apr 2023 18:24:49 +0800</pubDate><guid>https://ual.sg/post/2023/04/28/visits-to-liverpool-and-heidelberg/</guid><description>&lt;p&gt;The PI of the Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, represented the research group during a recent visit to Europe where he gave guest lectures and participated in a variety of activities to further our network:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;University of Liverpool, &lt;a href="https://www.liverpool.ac.uk/geographic-data-science/" target="_blank" rel="noopener"&gt;Geographic Data Science Lab&lt;/a&gt; &amp;#x1f1ec;&amp;#x1f1e7;&lt;/li&gt;
&lt;li&gt;Heidelberg University, &lt;a href="https://www.geog.uni-heidelberg.de/gis/index_en.html" target="_blank" rel="noopener"&gt;GIScience Research Group&lt;/a&gt; and &lt;a href="https://heigit.org" target="_blank" rel="noopener"&gt;Heidelberg Institute for Geoinformation Technology&lt;/a&gt; &amp;#x1f1e9;&amp;#x1f1ea;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;These labs are leaders in our domain.&lt;/p&gt;
&lt;p&gt;The &lt;a href="https://www.liverpool.ac.uk/geographic-data-science/" target="_blank" rel="noopener"&gt;Geographic Data Science Lab&lt;/a&gt; in Liverpool is a multidisciplinary group that serves as a centre of excellence for research and teaching within this emerging area, drawing expertise from the intersection of Geographic Information Science, Spatial Analysis and Applied Geocomputation. It is headed by Professor &lt;a href="https://www.franciscorowe.com" target="_blank" rel="noopener"&gt;Francisco Rowe&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/28/visits-to-liverpool-and-heidelberg/l-1_hu_4a0da2d4ac7fe151.webp 400w,
/post/2023/04/28/visits-to-liverpool-and-heidelberg/l-1_hu_215ea10354b7c5f9.webp 760w,
/post/2023/04/28/visits-to-liverpool-and-heidelberg/l-1_hu_a76bac78e8d3288b.webp 1200w"
src="https://ual.sg/post/2023/04/28/visits-to-liverpool-and-heidelberg/l-1_hu_4a0da2d4ac7fe151.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/28/visits-to-liverpool-and-heidelberg/l-2_hu_f9ac40649997d832.webp 400w,
/post/2023/04/28/visits-to-liverpool-and-heidelberg/l-2_hu_b3d22758756cd43.webp 760w,
/post/2023/04/28/visits-to-liverpool-and-heidelberg/l-2_hu_f7b4dd34973d38fc.webp 1200w"
src="https://ual.sg/post/2023/04/28/visits-to-liverpool-and-heidelberg/l-2_hu_f9ac40649997d832.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The &lt;a href="https://www.geog.uni-heidelberg.de/gis/index_en.html" target="_blank" rel="noopener"&gt;GIScience Research Group&lt;/a&gt; and &lt;a href="https://heigit.org" target="_blank" rel="noopener"&gt;HeiGIT&lt;/a&gt; (Heidelberg Institute for Geoinformation Technology) in Heidelberg are one of the leading groups in the domain of geographical information science, especially in user-generated geographical content (VGI, Crowdsourcing, Citizen Science). They are headed by Professor &lt;a href="https://www.geog.uni-heidelberg.de/gis/zipf.html" target="_blank" rel="noopener"&gt;Alexander Zipf&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-1_hu_b34a779f009e2b5d.webp 400w,
/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-1_hu_178f0723c8acd04b.webp 760w,
/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-1_hu_6c264731fdb02d2f.webp 1200w"
src="https://ual.sg/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-1_hu_b34a779f009e2b5d.webp"
width="760"
height="573"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-2_hu_75cee168d50db3c6.webp 400w,
/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-2_hu_8148832de8777492.webp 760w,
/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-2_hu_d9ade38465bfca74.webp 1200w"
src="https://ual.sg/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-2_hu_75cee168d50db3c6.webp"
width="760"
height="573"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-3_hu_d632bc7af52bca59.webp 400w,
/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-3_hu_e7391b397bdd2804.webp 760w,
/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-3_hu_402f2bcdbe44beb2.webp 1200w"
src="https://ual.sg/post/2023/04/28/visits-to-liverpool-and-heidelberg/h-3_hu_d632bc7af52bca59.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Both visits included guest lectures and several discussions on collaboration.&lt;/p&gt;
&lt;p&gt;Thank you for the insightful and productive discussions, and for sharing your inspiring work.
The hospitality is very much appreciated, and we look forward to future collaboration.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Zhao T, Huang Z, Tu W, Biljecki F, Chen L (2023): Developing a multiview spatiotemporal model based on deep graph neural networks to predict the travel demand by bus. &lt;em&gt;International Journal of Geographical Information Science&lt;/em&gt;, 37(7): 1555-1581. &lt;a href="https://doi.org/10.1080/13658816.2023.2203218" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1080/13658816.2023.2203218&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-ijgis-bus-demand/2023-ijgis-bus-demand.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/tianhong-zhao/"&gt;Tianhong Zhao&lt;/a&gt;.
Congratulations on his great work! &amp;#x1f64c; &amp;#x1f44f;
Tianhong had been with us for a year as a visiting scholar from Shenzhen University, and &lt;a href="https://ual.sg/post/2022/10/25/all-of-our-recent-visiting-scholars-have-been-awarded-prestigious-scholarships/"&gt;he was awarded a prestigious scholarship&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;The accurate prediction of travel demand by bus is crucial for effective urban mobility demand management. However, most models of travel demand prediction by bus tend to focus on the bus’s spatiotemporal dependencies, while ignoring the interactions between buses and other transportation modes, such as metros and taxis. We propose a Multiview Spatiotemporal Graph Neural Network (MSTGNN) model to predict short-term travel demand by bus. It emphasizes the ability to capture the interaction dependencies among the travel demand of buses, metros, and taxis. Firstly, a multiview graph consisting of bus, metro, and taxi views is constructed, with each view containing both a local and global graph. Secondly, a multiview attention-based temporal graph convolution module is developed to capture spatiotemporal and cross-view interaction dependencies among different transport modes. Especially, to address the uneven spatial distributions of features in multiview learning, the cross-view spatial feature consistency loss is introduced as an auxiliary loss. Finally, we conduct intensive experiments using a real-world dataset from Shenzhen, China. The results demonstrate that our proposed MSTGNN model performs better than the existing models. Ablation experiments validate the contributions of various modes of transportation to the improvement of the model’s performance.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/1_hu_aaf18589577422db.webp 400w,
/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/1_hu_3f8a0ee760ffadeb.webp 760w,
/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/1_hu_d10fdfb3d44a43cb.webp 1200w"
src="https://ual.sg/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/1_hu_aaf18589577422db.webp"
width="742"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/2_hu_a5478b4f2d3a3fba.webp 400w,
/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/2_hu_b6a573f595d5aee4.webp 760w,
/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/2_hu_f2478fab4c77628e.webp 1200w"
src="https://ual.sg/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/2_hu_a5478b4f2d3a3fba.webp"
width="760"
height="365"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-ijgis-bus-demand/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-ijgis-bus-demand/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/page-one_hu_b661556fb43a561e.webp 400w,
/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/page-one_hu_9a72117b8634c04b.webp 760w,
/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/page-one_hu_7b2c11088ec7c859.webp 1200w"
src="https://ual.sg/post/2023/04/27/new-paper-developing-a-multiview-spatiotemporal-model-based-on-deep-graph-neural-networks-to-predict-the-travel-demand-by-bus/page-one_hu_b661556fb43a561e.webp"
width="488"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_ijgis_bus_demand&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Tianhong Zhao and Zhengdong Huang and Wei Tu and Filip Biljecki and Long Chen}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1080/13658816.2023.2203218}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Geographical Information Science}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Developing a multiview spatiotemporal model based on deep graph neural networks to predict the travel demand by bus}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{37}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;issue&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{7}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1555-1581}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Integrating data on vegetation in digital twins - Takenaka Corporation and the Urban Analytics Lab</title><link>https://ual.sg/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/</link><pubDate>Tue, 25 Apr 2023 12:17:28 +0800</pubDate><guid>https://ual.sg/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/</guid><description>&lt;p&gt;We are relaying &lt;a href="https://www.linkedin.com/posts/nus-cde_urbansolutions-digitaltwins-nus-activity-7056460703796260864-vo18" target="_blank" rel="noopener"&gt;a release&lt;/a&gt; from our College of Design and Engineering about a significant milestone for us: a first large and funded collaboration with a company.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;In 2020, Assistant Professor &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; was awarded the NUS Presidential Young Professorship and formed the NUS Urban Analytics Lab in CDE’s Department of Architecture. His multidisciplinary research group focuses on urban data management and analysis, geographic data science, and digital twins. Digital twins map physical bodies into virtual spaces. The connected virtual replica allows researchers to collect intersecting dynamic data and test and optimise solutions from the virtual space to reality.&lt;/p&gt;
&lt;p&gt;Amongst the different research projects from his Lab, the work on integrating data on vegetation in digital twins caught the eye of Dr &lt;a href="https://ual.sg/author/kunihiko-fujiwara/"&gt;Kunihiko Fujiwara&lt;/a&gt;, Associate Chief Researcher at Takenaka Corporation, one of the major players in Japan&amp;rsquo;s construction, engineering, and architectural services sector. It intrigued Dr Fujiwara so much that he contacted Dr Biljecki to explore ways Takenaka could work closely with the Lab.&lt;/p&gt;
&lt;p&gt;Takenaka’s interest is in how such data can facilitate simulations that enable solutions for climate change mitigation and thermal comfort. In short, the question is how urban planners like Takenaka can refine their urban solutions by integrating information on greenery to address the pressing issues arising from climate change.&lt;/p&gt;
&lt;p&gt;In just two years since the Lab’s formation, UAL has secured its first industry partnership with a leading international corporation.&lt;/p&gt;
&lt;p&gt;Dr Fujiwara has joined UAL as a Visiting Research Fellow for a long-term period, and a PhD scholarship geared towards such industry collaborations has been set up. For the next four years, this position will support the joint work between Takenaka and UAL. Funding from Takenaka to the Lab will also contribute towards this initiative.&lt;/p&gt;
&lt;p&gt;This is a step towards UAL’s work in extending its research towards direct application in close partnership with like-minded corporations such as Takenaka. The speed of Takenaka’s engagement with UAL is a positive signal of the common interests of the two parties to cross many divides towards working together on innovations for a sustainable built environment.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/1682391330621_hu_337d515adf575348.webp 400w,
/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/1682391330621_hu_6d2661ecfeb98b32.webp 760w,
/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/1682391330621_hu_325c4d7d0b8629cb.webp 1200w"
src="https://ual.sg/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/1682391330621_hu_337d515adf575348.webp"
width="760"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/1682391332979_hu_e85e1b82b91ab82e.webp 400w,
/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/1682391332979_hu_b6411d4d3cdab8dd.webp 760w,
/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/1682391332979_hu_6beec080cfc289fd.webp 1200w"
src="https://ual.sg/post/2023/04/25/integrating-data-on-vegetation-in-digital-twins-takenaka-corporation-and-the-urban-analytics-lab/1682391332979_hu_e85e1b82b91ab82e.webp"
width="760"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper: Automatic assessment of public open spaces using street view imagery</title><link>https://ual.sg/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/</link><pubDate>Sun, 23 Apr 2023 10:05:16 +0800</pubDate><guid>https://ual.sg/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Chen S, Biljecki F (2023): Automatic assessment of public open spaces using street view imagery. &lt;em&gt;Cities&lt;/em&gt; 137: 104329. &lt;a href="https://doi.org/10.1016/j.cities.2023.104329" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.cities.2023.104329&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-cities-pos/2023-cities-pos.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/shuting-chen/"&gt;Shuting Chen&lt;/a&gt;.
Congratulations on the great work and publication! &amp;#x1f64c; &amp;#x1f44f;
Shuting is now at the University of Hong Kong, where she started her PhD after graduating from our NUS Master of Urban Planning programme.&lt;/p&gt;
&lt;p&gt;Studies using street-level imagery have largely been confined to driveable roads.
Shuting is among the first to take advantage of &amp;lsquo;off-road&amp;rsquo; imagery &amp;ndash; a small but growing volume of data captured in parks, on trails, along walkways, etc. &amp;ndash; to assess public open spaces.
She has developed a new automated method to analyse 800 parks and other types of open spaces in Hong Kong and Singapore.
The findings suggest that such imagery may be instrumental for POS assessment as a new method, extending the research scope to rarely considered off-road areas and contributing a new approach for the design and allocation of POS in urban planning.&lt;/p&gt;
&lt;p&gt;Until 2023-06-11, the article is available for free via &lt;a href="https://authors.elsevier.com/a/1gywVy5jOr5X7" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/1_hu_494eaba822d87697.webp 400w,
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/1_hu_90aecfa7e7de6acd.webp 760w,
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/1_hu_d356e2c65d2798b2.webp 1200w"
src="https://ual.sg/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/1_hu_494eaba822d87697.webp"
width="760"
height="610"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/2_hu_20092932018b5234.webp 400w,
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/2_hu_81902db45f534377.webp 760w,
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/2_hu_7c8b4fb0bccf1718.webp 1200w"
src="https://ual.sg/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/2_hu_20092932018b5234.webp"
width="760"
height="461"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/3_hu_40053a3958119591.webp 400w,
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/3_hu_c23fd0c1e89f1adf.webp 760w,
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/3_hu_ca67362a33906e4f.webp 1200w"
src="https://ual.sg/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/3_hu_40053a3958119591.webp"
width="666"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Public open space (POS) is essential to urban areas. Assessing them usually requires tedious approaches such as fieldwork and manual processes. Street View Imagery (SVI) and Computer Vision (CV) have been adopted in some urban environment research, bringing fine granularity and human perspective. However, limited aspects have been subject in these studies, and SVI and CV have not been used for holistic POS assessment. This research introduces a novel approach of employing them in conjunction with traditionally used geospatial and remote sensing data for automating POS assessment and doing so extensively. Indicators from both subjective and objective perspectives are developed, and CV algorithms are adopted for retrieving visual features. In a case study spanning 800 POS in Hong Kong and Singapore, a method is designed to predict both subjective and objective scores. The results demonstrate the perceptual models achieved acceptable to high accuracy scores, and suggest that SVI reflects different aspects of POS compared to previous approaches. The paper concludes that SVI can be adopted in POS assessment as a new instrument, extending their research scope to rarely considered off-road areas, and contributing with a new approach for the design and allocation of POS in urban planning.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Comprehensive assessment using street view imagery and computer vision.&lt;/li&gt;
&lt;li&gt;Uncovering the usability of rarely used street-level images beyond streetscapes.&lt;/li&gt;
&lt;li&gt;Developing and comparing objective and subjective perspectives for assessment.&lt;/li&gt;
&lt;li&gt;Combination of multi-sourced indicators reduces the bias of single data source.&lt;/li&gt;
&lt;li&gt;Imagery adds unparalleled insights to traditionally used GIS or remote sensing data.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-cities-pos/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-cities-pos/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/page-one_hu_fd2e4bc85ad1342a.webp 400w,
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/page-one_hu_9da8bc650c0aa315.webp 760w,
/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/page-one_hu_2a321e21aa193fa5.webp 1200w"
src="https://ual.sg/post/2023/04/23/new-paper-automatic-assessment-of-public-open-spaces-using-street-view-imagery/page-one_hu_fd2e4bc85ad1342a.webp"
width="576"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_cities_pos&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Shuting Chen and Filip Biljecki}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.cities.2023.104329}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104329}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Automatic assessment of public open spaces using street view imagery}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{137}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Quality of crowdsourced geospatial building information</title><link>https://ual.sg/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/</link><pubDate>Fri, 21 Apr 2023 22:00:16 +0800</pubDate><guid>https://ual.sg/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Chow YS, Lee K (2023): Quality of crowdsourced geospatial building information: A global assessment of OpenStreetMap attributes. &lt;em&gt;Building and Environment&lt;/em&gt; 237: 110295. &lt;a href="https://doi.org/10.1016/j.buildenv.2023.110295" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.buildenv.2023.110295&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-bae-osm-qa/2023-bae-osm-qa.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, and contributed by &lt;a href="https://ual.sg/author/yoong-shin-chow/"&gt;Yoong Shin Chow&lt;/a&gt; and &lt;a href="https://ual.sg/author/kay-lee/"&gt;Kay Lee&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Today, with more than 500 million buildings mapped, OpenStreetMap is the largest open dataset on the building stock.
The paper presents the first global study of the content and quality of building attributes in OpenStreetMap.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/1_hu_fb83c78ece94c605.webp 400w,
/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/1_hu_b88b6dadc63b5c25.webp 760w,
/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/1_hu_2b1b53d2c348f009.webp 1200w"
src="https://ual.sg/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/1_hu_fb83c78ece94c605.webp"
width="760"
height="583"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/2_hu_3dd20a5fa123e309.webp 400w,
/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/2_hu_7925e3980d12939b.webp 760w,
/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/2_hu_f62c8278d6f4c0ab.webp 1200w"
src="https://ual.sg/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/2_hu_3dd20a5fa123e309.webp"
width="760"
height="702"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Geospatial data of the building stock is essential in many domains pertaining to the built environment. These datasets are often provided by governments, but crowdsourcing them has surged in the last decade. Nowadays, OpenStreetMap (OSM) — the most popular Volunteered Geographic Information (VGI) platform — contains geospatial and descriptive data on more than 500 million buildings worldwide collected by millions of contributors, and it is increasingly used in studies ranging from energy and microclimate to urban planning and life cycle assessment. However, large-scale understanding on their quality remains limited, which may hinder their use and management. In this paper, we seek to understand the state of building information in OSM and whether it is a reliable source of such data. We provide a comprehensive study to assess the quality of attribute (descriptive) data of the building stock mapped globally, e.g. building function, which are key ingredients in many analyses and simulations in the built environment. We examine three aspects: completeness, consistency, and accuracy. In this assessment, the first at such scale and the most comprehensive available hitherto, we find that quality continues to be highly heterogeneous — from poor quality in some, to very high completeness in other areas, potentially benefiting a range of application domains, e.g. we estimate that 3D building models of 443 administrative units (mostly cities and municipalities) around the world can be generated from OSM, underpinning the generation of digital twins. The number of floors and building type are the most frequent properties that contributors record, and in most cases are highly accurate, while mapping the interior of buildings did not gain momentum.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;OpenStreetMap is the principal global crowdsourced geospatial dataset.&lt;/li&gt;
&lt;li&gt;More than half a billion buildings are mapped in OpenStreetMap.&lt;/li&gt;
&lt;li&gt;Most comprehensive OpenStreetMap building attribute data quality assessment.&lt;/li&gt;
&lt;li&gt;Our global analysis provides an understanding of the highly variable quality.&lt;/li&gt;
&lt;li&gt;In thousands of districts, OSM building data may be sufficient for some use cases.&lt;/li&gt;
&lt;/ul&gt;
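&lt;p&gt;As a quick illustration of the first of the three quality aspects the paper examines, attribute completeness can be thought of as the share of buildings carrying a given tag. A minimal Python sketch (the sample tag dictionaries below are hypothetical, not data from the paper):&lt;/p&gt;

```python
# Attribute completeness: the share of buildings that carry a given OSM tag.
# The sample data below is hypothetical and for illustration only.
buildings = [
    {"building": "residential", "building:levels": "4"},
    {"building": "yes"},
    {"building": "commercial", "building:levels": "2", "height": "8"},
    {"building": "yes", "building:levels": "10"},
]

def completeness(features, tag):
    """Fraction of features with a non-empty value for `tag`."""
    if not features:
        return 0.0
    return sum(1 for f in features if f.get(tag)) / len(features)

print(f"building:levels completeness: {completeness(buildings, 'building:levels'):.0%}")
print(f"height completeness: {completeness(buildings, 'height'):.0%}")
```

&lt;p&gt;The same counting logic scales to any tag of interest, e.g. &lt;code&gt;building:material&lt;/code&gt; or &lt;code&gt;roof:shape&lt;/code&gt;, once footprints are fetched for an area.&lt;/p&gt;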
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-bae-osm-qa/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-bae-osm-qa/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/page-one_hu_b148b349e5f9e651.webp 400w,
/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/page-one_hu_bdac587d9b77ef67.webp 760w,
/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/page-one_hu_a5498eccbb7c59dd.webp 1200w"
src="https://ual.sg/post/2023/04/21/new-paper-quality-of-crowdsourced-geospatial-building-information/page-one_hu_b148b349e5f9e651.webp"
width="597"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_bae_osm_qa&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Filip Biljecki and Yoong Shin Chow and Kay Lee}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.buildenv.2023.110295}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Building and Environment}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{110295}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Quality of crowdsourced geospatial building information: A global assessment of OpenStreetMap attributes}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{237}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Visits by academics from Germany, China, and Poland</title><link>https://ual.sg/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/</link><pubDate>Fri, 21 Apr 2023 09:24:19 +0800</pubDate><guid>https://ual.sg/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/</guid><description>&lt;p&gt;We are having a busy April &amp;ndash; we are happy to have hosted a few academic friends from overseas for short-term visits.&lt;/p&gt;
&lt;p&gt;First, it was a pleasure to host &lt;a href="https://milojevicdupontnikola.github.io" target="_blank" rel="noopener"&gt;Nikola Milojevic-Dupont&lt;/a&gt; (&lt;a href="https://www.mcc-berlin.net/" target="_blank" rel="noopener"&gt;Mercator Research Institute for Global Commons and Climate Change&lt;/a&gt; and &lt;a href="https://www.susturbecon.tu-berlin.de/sustainability_economics_of_human_settlements/" target="_blank" rel="noopener"&gt;Technical University Berlin&lt;/a&gt;).
He gave a lecture on Artificial intelligence for accelerating low-carbon urban planning.
In particular, the highlight was &lt;a href="https://eubucco.com" target="_blank" rel="noopener"&gt;EUBUCCO&lt;/a&gt;, an open database of European building stock characteristics covering 200+ million individual buildings (read the &lt;a href="https://doi.org/10.1038/s41597-023-02040-2" target="_blank" rel="noopener"&gt;paper&lt;/a&gt;), in which &lt;a href="https://ual.sg/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/"&gt;we were involved&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/a1_hu_b0c66d7e88ca1994.webp 400w,
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/a1_hu_2f23157e2b743f67.webp 760w,
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/a1_hu_611c21edf8225cc4.webp 1200w"
src="https://ual.sg/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/a1_hu_b0c66d7e88ca1994.webp"
width="760"
height="573"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/a2_hu_5646957eee88f23d.webp 400w,
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/a2_hu_9d93a53664a16e89.webp 760w,
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/a2_hu_58b551d4ecd09a2a.webp 1200w"
src="https://ual.sg/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/a2_hu_5646957eee88f23d.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Second, we welcomed &lt;a href="https://huangweibuct.github.io/weihuang.github.io/" target="_blank" rel="noopener"&gt;Wei Huang&lt;/a&gt;, Professor of GIScience at &lt;a href="https://celiang.tongji.edu.cn/info/1300/2388.htm" target="_blank" rel="noopener"&gt;Tongji University&lt;/a&gt;.
He gave a great lecture on Urban Analytics and Social Sensing, and presented several multidisciplinary research projects conducted at his group in Shanghai.
This visit was facilitated by &lt;a href="https://profile.nus.edu.sg/fass/geoyy/" target="_blank" rel="noopener"&gt;Yingwei Yan&lt;/a&gt;, Lecturer and the Director of the &lt;a href="https://fass.nus.edu.sg/geog/msc-in-applied-gis/" target="_blank" rel="noopener"&gt;MSc in Applied GIS programme&lt;/a&gt; and our collaborator from the &lt;a href="https://fass.nus.edu.sg/geog/" target="_blank" rel="noopener"&gt;NUS Department of Geography&lt;/a&gt;.
Both of them are leading the &lt;a href="https://www2.isprs.org/commissions/comm4/wg6/" target="_blank" rel="noopener"&gt;ISPRS WG IV/6 Human Behaviour and Spatial Interactions&lt;/a&gt;, with which we are organising the workshop
&lt;a href="https://gsw2023.com/index.php/project/geohb-2023-geo-spatial-computing-for-understanding-human-behaviours/" target="_blank" rel="noopener"&gt;GeoHB 2023: Geo-Spatial Computing for Understanding Human Behaviours&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/b1_hu_6dd34ce7caa5121d.webp 400w,
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/b1_hu_895dc7a5fe6230fd.webp 760w,
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/b1_hu_6f1831aca4ad9c8c.webp 1200w"
src="https://ual.sg/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/b1_hu_6dd34ce7caa5121d.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Finally, we hosted professors &lt;a href="https://bazawiedzy.upwr.edu.pl/info.seam?id=UPWr4d682756bd1243c58f310f8e07f263af&amp;amp;lang=en" target="_blank" rel="noopener"&gt;Witold Rohm&lt;/a&gt; and
&lt;a href="https://scholar.google.com.my/citations?user=4mvcBXQAAAAJ&amp;amp;hl=en" target="_blank" rel="noopener"&gt;Pawel Boguslawski&lt;/a&gt; from the &lt;a href="https://www.igig.up.wroc.pl/en/" target="_blank" rel="noopener"&gt;Institute of Geodesy and Geoinformatics&lt;/a&gt; at the &lt;a href="https://upwr.edu.pl" target="_blank" rel="noopener"&gt;Wroclaw University of Environmental and Life Sciences&lt;/a&gt;.
Witold is the Head of the Institute, and his expertise is in GNSS meteorology, remote sensing, and satellite geodesy.
Pawel is an expert in geoinformatics, and he is co-chairing the &lt;a href="https://www2.isprs.org/commissions/comm4/wg1/" target="_blank" rel="noopener"&gt;ISPRS WG IV/1 Spatial Data Representation and Interoperability&lt;/a&gt;, in which the PI of our Lab is involved as well (read more &lt;a href="https://ual.sg/post/2023/04/19/filip-biljecki-chairs-the-isprs-working-group-on-spatial-data-representation-and-interoperability/"&gt;here in a recent blog post&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/c1_hu_749adae9d7c57ba0.webp 400w,
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/c1_hu_7a699768b015fd4e.webp 760w,
/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/c1_hu_a06feda925f44a4e.webp 1200w"
src="https://ual.sg/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/c1_hu_749adae9d7c57ba0.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Our visitors gave guest lectures, held a series of 1:1 meetings, and participated in discussions.
Thanks for visiting us, and looking forward to future collaborations!&lt;/p&gt;</description></item><item><title>Filip Biljecki chairs the ISPRS Working Group on Spatial Data Representation and Interoperability</title><link>https://ual.sg/post/2023/04/19/filip-biljecki-chairs-the-isprs-working-group-on-spatial-data-representation-and-interoperability/</link><pubDate>Wed, 19 Apr 2023 07:31:28 +0800</pubDate><guid>https://ual.sg/post/2023/04/19/filip-biljecki-chairs-the-isprs-working-group-on-spatial-data-representation-and-interoperability/</guid><description>&lt;p&gt;The Director of the Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, has been appointed as Chair of the working group &lt;a href="https://www2.isprs.org/commissions/comm4/wg1/" target="_blank" rel="noopener"&gt;Spatial Data Representation and Interoperability&lt;/a&gt; at the &lt;a href="https://www.isprs.org/" target="_blank" rel="noopener"&gt;International Society for Photogrammetry and Remote Sensing (ISPRS)&lt;/a&gt;, a prominent international organisation in photogrammetry, remote sensing and spatial information sciences.
The organisation was established more than a century ago and is the oldest and best-known umbrella society in this domain.&lt;/p&gt;
&lt;p&gt;This working group has been designated as Working Group 1 within the &lt;a href="https://www2.isprs.org/commissions/comm4/" target="_blank" rel="noopener"&gt;ISPRS Technical Commission IV&lt;/a&gt;, which focuses on Spatial Information Science.&lt;/p&gt;
&lt;p&gt;In the ISPRS WG IV/1, Filip will collaborate with &lt;a href="http://www.noardo.eu" target="_blank" rel="noopener"&gt;Francesca Noardo&lt;/a&gt; from the Open Geospatial Consortium, &lt;a href="https://scholar.google.com.my/citations?user=4mvcBXQAAAAJ&amp;amp;hl=en" target="_blank" rel="noopener"&gt;Pawel Boguslawski&lt;/a&gt; from the Wroclaw University of Environmental and Life Sciences, and &lt;a href="https://www.polito.it/en/staff?p=elisabetta.colucci" target="_blank" rel="noopener"&gt;Elisabetta Colucci&lt;/a&gt; from Politecnico di Torino.&lt;/p&gt;
&lt;p&gt;The working group has a mandate until 2026, and has the following mission:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;This working group aims to promote research on spatial data representations, their standardisation, and interoperability. This topic is increasingly important with the proliferation of digital twins, evolvement of new and developing urban data sources such as 3D city models and street view imagery, and the increasing volume and use of volunteered geoinformation. The group aims to intensify the relationships with standardisation organisations such as OGC, to strengthen topics related to open science/data/software.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The working group already has a number of activities under way, most prominently the co-organisation of the workshop
&lt;a href="https://gsw2023.com/index.php/project/geohb-2023-geo-spatial-computing-for-understanding-human-behaviours/" target="_blank" rel="noopener"&gt;GeoHB 2023: Geo-Spatial Computing for Understanding Human Behaviours&lt;/a&gt;.
This event is organised in collaboration with Hao Li (TU Munich), Yingwei Yan (NUS Geography), Wei Huang (Tongji University) (&lt;a href="https://ual.sg/post/2023/04/21/visits-by-academics-from-germany-china-and-poland/"&gt;who was at our group last week&lt;/a&gt;), Yair Grinberger (HUJ), and Bi Yu Chen (Wuhan University), who are all part of other working groups within the same Commission.
The workshop is part of the &lt;a href="https://gsw2023.com" target="_blank" rel="noopener"&gt;ISPRS Geospatial Week 2023&lt;/a&gt; in Cairo, Egypt (September 2023).&lt;/p&gt;
&lt;p&gt;If you have a chance, please consider submitting a paper to this workshop.
Please see the poster below (note that the deadline has been extended).&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/04/19/filip-biljecki-chairs-the-isprs-working-group-on-spatial-data-representation-and-interoperability/poster_hu_53f78a90aed29653.webp 400w,
/post/2023/04/19/filip-biljecki-chairs-the-isprs-working-group-on-spatial-data-representation-and-interoperability/poster_hu_b9e905331d5a5dd5.webp 760w,
/post/2023/04/19/filip-biljecki-chairs-the-isprs-working-group-on-spatial-data-representation-and-interoperability/poster_hu_b396c60f05f16ff.webp 1200w"
src="https://ual.sg/post/2023/04/19/filip-biljecki-chairs-the-isprs-working-group-on-spatial-data-representation-and-interoperability/poster_hu_53f78a90aed29653.webp"
width="538"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>AAG 2023 and visits in the United States</title><link>https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/</link><pubDate>Sun, 02 Apr 2023 14:03:28 +0800</pubDate><guid>https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/</guid><description>&lt;p&gt;We participated and contributed to the &lt;a href="https://www.aag.org/events/aag2023/" target="_blank" rel="noopener"&gt;2023 AAG Annual Meeting&lt;/a&gt; in Denver, Colorado, USA.
This is the flagship event of &lt;a href="https://www.aag.org" target="_blank" rel="noopener"&gt;The American Association of Geographers (AAG)&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; has co-organised the GeoAI and Deep Learning Symposium: Urban Visual Intelligence and gave a talk on &amp;lsquo;Crowdsourced street view imagery and urban informatics&amp;rsquo;, while &lt;a href="https://ual.sg/author/mengbi-ye/"&gt;Mengbi Ye&lt;/a&gt; gave the talk &amp;lsquo;Analyzing the Mass Rapid Transit (MRT) passengers&amp;rsquo; spatio-temporal characteristics during Covid-19 using Singapore smart card data&amp;rsquo; at the Symposium on Human Dynamics Research: Lessons Learned From the COVID-19 Pandemic - Evolving Geospatial Methods and Perspectives of Human Mobility and Urban Dynamics Research.&lt;/p&gt;
&lt;p&gt;The NUS Urban Analytics Lab co-organised this session with &lt;a href="http://www.kkyyhh96.site/" target="_blank" rel="noopener"&gt;Yuhao Kang&lt;/a&gt; from the University of Wisconsin-Madison, &lt;a href="https://dusp.mit.edu/people/fabio-duarte" target="_blank" rel="noopener"&gt;Fábio Duarte&lt;/a&gt; from the Massachusetts Institute of Technology, and &lt;a href="https://www.ce.ust.hk/people/fan-zhang-zhangfan" target="_blank" rel="noopener"&gt;Fan Zhang&lt;/a&gt; from the Hong Kong University of Science and Technology.&lt;/p&gt;
&lt;p&gt;We thank everyone for attending our session and thank the contributors (&lt;a href="https://geo.msu.edu/directory/park-hyunseo.html" target="_blank" rel="noopener"&gt;Hyunseo Park&lt;/a&gt;, &lt;a href="https://www.linkedin.com/in/jiyoung-lee-698615200" target="_blank" rel="noopener"&gt;Jiyoung Lee&lt;/a&gt; and &lt;a href="https://www.keemoonjang.com" target="_blank" rel="noopener"&gt;Kee Moon Jang&lt;/a&gt;) for presenting their latest work.&lt;/p&gt;
&lt;p&gt;After the AAG conference, Filip visited the following groups &amp;amp; departments, where he gave guest lectures:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;University of Wisconsin &amp;ndash; Madison, Department of Geography, &lt;a href="https://geography.wisc.edu/geods/" target="_blank" rel="noopener"&gt;Geospatial Data Science Lab&lt;/a&gt; (&lt;a href="https://geography.wisc.edu/staff/gao-song/" target="_blank" rel="noopener"&gt;Prof Song Gao&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;University of Minnesota, Department of Computer Science and Engineering, &lt;a href="https://knowledge-computing.github.io" target="_blank" rel="noopener"&gt;Knowledge Computing Lab&lt;/a&gt; (&lt;a href="https://cse.umn.edu/cs/yao-yi-chiang" target="_blank" rel="noopener"&gt;Prof Yao-Yi Chiang&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;We look forward to collaborating with these wonderful research groups, and thank them for their hospitality.&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-session-at-aag-on-urban-visual-intelligence"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Session at AAG on Urban Visual Intelligence" srcset="
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-1_hu_516cc5d417aa41e3.webp 400w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-1_hu_765c0064d6a92965.webp 760w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-1_hu_5d8c67fed741ca12.webp 1200w"
src="https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-1_hu_516cc5d417aa41e3.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Session at AAG on Urban Visual Intelligence
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-session-at-aag-on-urban-visual-intelligence"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Session at AAG on Urban Visual Intelligence" srcset="
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-2_hu_db7b3d2149f5c323.webp 400w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-2_hu_583a71a85acd2255.webp 760w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-2_hu_acdd9fd72a3cf37a.webp 1200w"
src="https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-2_hu_db7b3d2149f5c323.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Session at AAG on Urban Visual Intelligence
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-session-at-aag-on-urban-visual-intelligence"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Session at AAG on Urban Visual Intelligence" srcset="
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-3_hu_11369f0f655d645a.webp 400w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-3_hu_29a6a91bd7a52e69.webp 760w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-3_hu_2b444926a0ec598e.webp 1200w"
src="https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-3_hu_11369f0f655d645a.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Session at AAG on Urban Visual Intelligence
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-session-at-aag-on-urban-visual-intelligence"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Session at AAG on Urban Visual Intelligence" srcset="
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-4_hu_5ed915b80a2c46e2.webp 400w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-4_hu_d9637398781cb826.webp 760w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-4_hu_5f236c83072c37b6.webp 1200w"
src="https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/aag-4_hu_5ed915b80a2c46e2.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Session at AAG on Urban Visual Intelligence
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-session-at-aag-on-urban-visual-intelligence"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Session at AAG on Urban Visual Intelligence" srcset="
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/UrbanVisualIntelligence_hu_be7beed647b092fe.webp 400w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/UrbanVisualIntelligence_hu_7852ab2eff902c1d.webp 760w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/UrbanVisualIntelligence_hu_40f271aa14a2779d.webp 1200w"
src="https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/UrbanVisualIntelligence_hu_be7beed647b092fe.webp"
width="650"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Session at AAG on Urban Visual Intelligence
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-university-of-wisconsin----madison"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="University of Wisconsin -- Madison" srcset="
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-1_hu_99340edc47aca768.webp 400w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-1_hu_e8d28403236f370f.webp 760w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-1_hu_763154346b943d46.webp 1200w"
src="https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-1_hu_99340edc47aca768.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
University of Wisconsin &amp;ndash; Madison
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-university-of-wisconsin----madison"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="University of Wisconsin -- Madison" srcset="
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-2_hu_7e7b4b77d10ae832.webp 400w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-2_hu_34d591b48f5ccf7f.webp 760w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-2_hu_827061fbb7ff1261.webp 1200w"
src="https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-2_hu_7e7b4b77d10ae832.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
University of Wisconsin &amp;ndash; Madison
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-university-of-wisconsin----madison"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="University of Wisconsin -- Madison" srcset="
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-3_hu_9c571bd4d99af8c3.webp 400w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-3_hu_52f074e573d70397.webp 760w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-3_hu_7878ae8fd5f8efcf.webp 1200w"
src="https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-3_hu_9c571bd4d99af8c3.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
University of Wisconsin &amp;ndash; Madison
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-university-of-wisconsin----madison"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="University of Wisconsin -- Madison" srcset="
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-4_hu_3b3296a4ce69774.webp 400w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-4_hu_c5523cf982bb8e2.webp 760w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-4_hu_2f4d626ae00a7867.webp 1200w"
src="https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/uwm-4_hu_3b3296a4ce69774.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
University of Wisconsin &amp;ndash; Madison
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-university-of-minnesota"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="University of Minnesota" srcset="
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/umn-1_hu_48822b0ab9305bfb.webp 400w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/umn-1_hu_7f59fabfe78ebf37.webp 760w,
/post/2023/04/02/aag-2023-and-visits-in-the-united-states/umn-1_hu_fb993a4e783e8165.webp 1200w"
src="https://ual.sg/post/2023/04/02/aag-2023-and-visits-in-the-united-states/umn-1_hu_48822b0ab9305bfb.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
University of Minnesota
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper: European building stock characteristics in a common and open database</title><link>https://ual.sg/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/</link><pubDate>Mon, 20 Mar 2023 21:21:16 +0800</pubDate><guid>https://ual.sg/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Milojevic-Dupont N, Wagner F, Nachtigall F, Hu J, Brüser GB, Zumwald M, Biljecki F, Heeren N, Kaack LH, Pichler PP, Creutzig F (2023): EUBUCCO v0.1: European building stock characteristics in a common and open database for 200+ million individual buildings. &lt;em&gt;Scientific Data&lt;/em&gt; 10: 147. &lt;a href="https://doi.org/10.1038/s41597-023-02040-2" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1038/s41597-023-02040-2&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-sd-eubucco/2023-sd-eubucco.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;EUBUCCO (EUropean BUilding stock Characteristics in a Common and Open database for 206 million individual buildings) is a scientific database of individual building footprints for 200+ million buildings across the 27 European Union countries and Switzerland, together with three main attributes &amp;ndash; building type, height and construction year &amp;ndash; included for 45%, 74% and 24% of the buildings, respectively.
EUBUCCO is composed of 50 open government datasets and OpenStreetMap that have been collected, harmonized and partly validated.&lt;/p&gt;
&lt;p&gt;EUBUCCO provides the basis for high-resolution urban sustainability studies across scales – continental, comparative or local studies – using a centralized source and is relevant for a variety of use cases, e.g. for energy system analysis or natural hazard risk assessments.&lt;/p&gt;
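&lt;p&gt;For a sense of scale, the coverage shares quoted above translate into rough absolute counts (back-of-the-envelope arithmetic, not figures reported in the paper):&lt;/p&gt;

```python
# Rough attribute coverage implied by the shares quoted above for
# EUBUCCO's ~206 million buildings. Illustrative arithmetic only.
TOTAL_BUILDINGS = 206_000_000
shares = {"building type": 0.45, "height": 0.74, "construction year": 0.24}

coverage = {attr: round(share * TOTAL_BUILDINGS / 1e6)
            for attr, share in shares.items()}
for attr, millions in coverage.items():
    print(f"{attr}: ~{millions} million buildings")
```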
&lt;figure id="figure-the-50-input-datasets-parsed-to-generate-eubucco-v01-bold-font-indicates-country-level-datasets-while-normal-font-indicates-region--or-city-level-datasets-datasets-for-a-same-country-are-designated-with-different-tones-of-the-same-color-all-areas-where-openstreetmap-was-used-as-basis-for-the-building-footprints-are-colored-in-light-pink"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="The 50 input datasets parsed to generate Eubucco v0.1. Bold font indicates country-level datasets, while normal font indicates region- or city-level datasets. Datasets for a same country are designated with different tones of the same color. All areas where OpenStreetMap was used as basis for the building footprints are colored in light pink." srcset="
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/1_hu_2a2270cf6c499fba.webp 400w,
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/1_hu_607739fddf8aa3d7.webp 760w,
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/1_hu_a5dfd5cccdd7e432.webp 1200w"
src="https://ual.sg/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/1_hu_2a2270cf6c499fba.webp"
width="760"
height="608"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
The 50 input datasets parsed to generate Eubucco v0.1. Bold font indicates country-level datasets, while normal font indicates region- or city-level datasets. Datasets for the same country are designated with different tones of the same color. All areas where OpenStreetMap was used as the basis for the building footprints are colored in light pink.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;The project was led by &lt;a href="https://milojevicdupontnikola.github.io" target="_blank" rel="noopener"&gt;Nikola Milojevic-Dupont&lt;/a&gt; and &lt;a href="https://www.mcc-berlin.net/en/about/team/wagner-felix.html" target="_blank" rel="noopener"&gt;Felix Wagner&lt;/a&gt; from &lt;a href="https://www.mcc-berlin.net/" target="_blank" rel="noopener"&gt;Mercator Research Institute for Global Commons and Climate Change&lt;/a&gt; and the Technical University Berlin (&lt;a href="https://www.susturbecon.tu-berlin.de/sustainability_economics_of_human_settlements/" target="_blank" rel="noopener"&gt;Chair of Sustainability Economics of Human Settlements&lt;/a&gt;).
The project encompassed a large team, including us (the only collaborator from outside Europe).
We look forward to more collaborations with Nikola, the rest of the team in Berlin, and others.&lt;/p&gt;
&lt;figure id="figure-overview-of-the-processing-workflow-of-eubucco-v01"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Overview of the processing workflow of Eubucco v0.1." srcset="
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/2_hu_6bf79a58b0a323f.webp 400w,
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/2_hu_37f654dfc4473e2e.webp 760w,
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/2_hu_4ab804085b6efa7a.webp 1200w"
src="https://ual.sg/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/2_hu_6bf79a58b0a323f.webp"
width="732"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Overview of the processing workflow of Eubucco v0.1.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Check out the website &lt;a href="https://eubucco.com" target="_blank" rel="noopener"&gt;here&lt;/a&gt;, where you can find more information about the project and links to download the data.
The data is also archived on the scientific repository &lt;a href="https://zenodo.org/record/7225259" target="_blank" rel="noopener"&gt;Zenodo&lt;/a&gt;.
All the code used to generate the data is openly available in the &lt;a href="https://github.com/ai4up/eubucco" target="_blank" rel="noopener"&gt;GitHub repository of the project&lt;/a&gt;.&lt;/p&gt;
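&lt;p&gt;As a quick illustration of how such data can be used, the sketch below computes per-attribute completeness on a tiny synthetic table standing in for a EUBUCCO country extract. The column names and values here are assumptions for illustration, not the official schema.&lt;/p&gt;

```python
import pandas as pd

# Tiny synthetic stand-in for a EUBUCCO country extract; the real data ships
# as per-country downloads, and the column names below are assumed
df = pd.DataFrame({
    "id": ["b1", "b2", "b3", "b4"],
    "height": [12.5, None, 9.0, 21.0],   # metres
    "age": [1965, None, None, 1990],     # construction year
    "type": ["residential", "non-residential", None, "residential"],
})

# Share of buildings with each attribute filled in, analogous to the
# completeness figures reported for the full database
completeness = {c: float(df[c].notna().mean()) for c in ["height", "age", "type"]}
print(completeness)  # {'height': 0.75, 'age': 0.5, 'type': 0.75}
```

On the real data, the same pattern applies after loading a country file with a geospatial library such as geopandas.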
&lt;figure id="figure-illustration-of-the-attributes-present-in-eubucco-v01-the-three-maps-represent-buildings-footprints-and-the-buildings-attributes-present-in-the-database--type-height-and-construction-year--for-an-example-neighborhood-in-paris-while-the-footprint-shows-the-urban-morphology-of-the-neighborhood-the-attributes-enable-to-distinguish-further-contexts"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Illustration of the attributes present in Eubucco v0.1. The three maps represent buildings footprints and the buildings attributes present in the database – type, height and construction year – for an example neighborhood in Paris. While the footprint shows the urban morphology of the neighborhood, the attributes enable to distinguish further contexts." srcset="
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/3_hu_51b7a5868b4babc1.webp 400w,
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/3_hu_858edf3af9b8d355.webp 760w,
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/3_hu_5508d3132f37047.webp 1200w"
src="https://ual.sg/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/3_hu_51b7a5868b4babc1.webp"
width="760"
height="209"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Illustration of the attributes present in Eubucco v0.1. The three maps represent building footprints and the building attributes present in the database – type, height and construction year – for an example neighborhood in Paris. While the footprint shows the urban morphology of the neighborhood, the attributes make it possible to distinguish further contexts.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;The project is funded by the Climate Change Center Berlin Brandenburg and the CircEUlar project of the European Union’s Horizon Europe research and innovation programme under grant agreement 101056810.&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Building stock management is becoming a global societal and political issue, inter alia because of growing sustainability concerns. Comprehensive and openly accessible building stock data can enable impactful research exploring the most effective policy options. In Europe, efforts from citizens and governments generated numerous relevant datasets but these are fragmented and heterogeneous, thus hindering their usability. Here, we present eubucco v0.1, a database of individual building footprints for ~202 million buildings across the 27 European Union countries and Switzerland. Three main attributes – building height, construction year and type – are included for respectively 73%,
24% and 46% of the buildings. We identify, collect and harmonize 50 open government datasets and OpenStreetMap, and perform extensive validation analyses to assess the quality, consistency and completeness of the data in every country. eubucco v0.1 provides the basis for high-resolution urban sustainability studies across scales – continental, comparative or local studies – using a centralized source and is relevant for a variety of use cases, e.g., for energy system analysis or natural hazard risk assessments.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-sd-eubucco/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-sd-eubucco/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/page-one_hu_a89f901fbb849b70.webp 400w,
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/page-one_hu_377387aedf90e191.webp 760w,
/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/page-one_hu_5e158f5a32b27c52.webp 1200w"
src="https://ual.sg/post/2023/03/20/new-paper-european-building-stock-characteristics-in-a-common-and-open-database/page-one_hu_a89f901fbb849b70.webp"
width="578"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_sd_eubucco&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Milojevic-Dupont, Nikola and Wagner, Felix} and Nachtigall, Florian and Hu, Jiawei and Br{\&amp;#34;u}ser, Geza Boi and Zumwald, Marius and Biljecki, Filip and Heeren, Niko and Kaack, Lynn H. and Pichler, Peter-Paul and Creutzig, Felix}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1038/s41597-023-02040-2}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Scientific Data}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{147}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{EUBUCCO v0.1: European building stock characteristics in a common and open database for 200+ million individual buildings}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Knowledge and Topology</title><link>https://ual.sg/post/2023/03/17/new-paper-knowledge-and-topology/</link><pubDate>Fri, 17 Mar 2023 09:21:16 +0800</pubDate><guid>https://ual.sg/post/2023/03/17/new-paper-knowledge-and-topology/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Zhang Y, Liu P, Biljecki F (2023): Knowledge and topology: A two layer spatially dependent graph neural networks to identify urban functions with time-series street view image. &lt;em&gt;ISPRS Journal of Photogrammetry and Remote Sensing&lt;/em&gt; 198: 153-168. &lt;a href="https://doi.org/10.1016/j.isprsjprs.2023.03.008" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.isprsjprs.2023.03.008&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-ijprs-knowledge-topology/2023-ijprs-knowledge-topology.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/yan-zhang/"&gt;Yan Zhang&lt;/a&gt;.
Congratulations to him on his great work! &amp;#x1f64c; &amp;#x1f44f;
Yan had been with us for a year as a visiting scholar from Wuhan University, and &lt;a href="https://ual.sg/post/2022/10/25/all-of-our-recent-visiting-scholars-have-been-awarded-prestigious-scholarships/"&gt;he was awarded a prestigious scholarship&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Urban geo-tagged proximate sensing images, such as street view imagery (SVI) and photos shared on social media, are generated continuously in cities.
They may be regarded as twin mappings of the city’s operational state, constantly &amp;lsquo;refreshing&amp;rsquo; our view of urban areas.
Faced with a large volume of geo-tagged images, capturing their rich semantic information and their spatio-temporal relationships is crucial to understanding and interpreting urban space.
In this paper, we propose a purely visual scheme for the functional perception of urban streets, which incorporates urban knowledge and road network topology and can fuse images from multiple sources into a holistic representation of a spatial unit.
We also incorporate temporal information, integrating historical street view images to compute urban spatio-temporal changes, renewal rates, and an urban function transition matrix.
Using historical SVI is a rarity in urban analytics; this temporal perspective on sensing urban change and renewal is another major contribution of this work.&lt;/p&gt;
&lt;p&gt;The implementation has been open-sourced &lt;a href="https://github.com/yemanzhongting/Knowledge-and-Topology" target="_blank" rel="noopener"&gt;on GitHub&lt;/a&gt;.&lt;/p&gt;
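&lt;p&gt;The core spatial idea, treating streets as graph nodes whose representations are mixed with those of topologically adjacent streets, can be sketched in a few lines. This is a toy illustration with invented values and names, not the paper&amp;rsquo;s implementation.&lt;/p&gt;

```python
import networkx as nx
import numpy as np

# Toy street graph: nodes are streets, edges connect streets sharing a junction
# (a stand-in for the OSM-derived topology used in the paper)
G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D")])

# Per-street scene embeddings, e.g. averaged over that street's SVI frames
# (random vectors here; in the paper these come from a pre-trained model)
rng = np.random.default_rng(0)
emb = {n: rng.normal(size=8) for n in G.nodes}

# One round of neighbourhood aggregation, the core operation behind
# GraphSAGE-style spatially dependent layers: mix each street's embedding
# with the mean of its neighbours' embeddings
agg = {}
for n in G.nodes:
    neigh = np.mean([emb[m] for m in G.neighbors(n)], axis=0)
    agg[n] = 0.5 * emb[n] + 0.5 * neigh
```

Stacking such rounds with learned mixing weights, and training against a small set of labelled streets, is what allows only a fraction of nodes to be labelled, as reported in the paper.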
&lt;p&gt;Until 2023-05-05, the article is available for free via &lt;a href="https://authors.elsevier.com/a/1glkC3I9x1mz9e" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/17/new-paper-knowledge-and-topology/1_hu_e5d723ff9db939ab.webp 400w,
/post/2023/03/17/new-paper-knowledge-and-topology/1_hu_cc871fee084a19d0.webp 760w,
/post/2023/03/17/new-paper-knowledge-and-topology/1_hu_9081d506ffbdd7cc.webp 1200w"
src="https://ual.sg/post/2023/03/17/new-paper-knowledge-and-topology/1_hu_e5d723ff9db939ab.webp"
width="760"
height="589"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/17/new-paper-knowledge-and-topology/2_hu_4c7d7d9084177b80.webp 400w,
/post/2023/03/17/new-paper-knowledge-and-topology/2_hu_36fb51e2cac81f2.webp 760w,
/post/2023/03/17/new-paper-knowledge-and-topology/2_hu_971135010cb5d98a.webp 1200w"
src="https://ual.sg/post/2023/03/17/new-paper-knowledge-and-topology/2_hu_4c7d7d9084177b80.webp"
width="760"
height="614"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;With the rise of GeoAI research, streetscape imagery has received extensive attention due to its comprehensive coverage, abundant information, and accessibility. However, obtaining a holistic spatial–temporal scene representation is difficult because places are often composed of multiple images from different angles, times and locations. This problem also exists in other types of geo-tagged imagery. To solve it, we propose a purely visual, robust, and reliable method for urban function identification at the street scale. We introduce a method based on a two-layer spatially dependent graph neural network structure, which handles sequential street view imagery as input (typically available in services such as Google Street View, Baidu Maps, and Mapillary), with full consideration of the spatial dependencies among road networks. In this paper, we construct an urban topological map network using OpenStreetMap data in Wuhan, China, and compute a semantic representation of the scene as a whole at the street scale using a large-scale pre-trained model. We construct the graph network with streets as nodes based on 28,693 mapping relationships constructed from 75,628 street view images and 5,458 streets. Only 5.3% of the node labels were required to obtain 10 categories of functions for all nodes in the study area. The results demonstrate that by using appropriate spatial weights, street encoder, and graph structure, our novel method achieves high accuracy of P@1 46.2%, P@3 73.0%, P@5 82.4%, and P@10 89.9%, fully demonstrating the effectiveness of the introduced approach. We also use the model to sense urban spatial–temporal renewal by computing time series street images. The model is also applicable to the prediction of other attributes, where only a small number of labels are required to obtain valid and reliable scene perception results. 
The example data and code is shared at: &lt;a href="https://github.com/yemanzhongting/Knowledge-and-Topology" target="_blank" rel="noopener"&gt;https://github.com/yemanzhongting/Knowledge-and-Topology&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-ijprs-knowledge-topology/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-ijprs-knowledge-topology/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/17/new-paper-knowledge-and-topology/page-one_hu_481015a042d26206.webp 400w,
/post/2023/03/17/new-paper-knowledge-and-topology/page-one_hu_7cdae612b8a4fbc5.webp 760w,
/post/2023/03/17/new-paper-knowledge-and-topology/page-one_hu_eedde3b9fa625d5a.webp 1200w"
src="https://ual.sg/post/2023/03/17/new-paper-knowledge-and-topology/page-one_hu_481015a042d26206.webp"
width="562"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_ijprs_knowledge_topology&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Knowledge and topology: A two layer spatially dependent graph neural networks to identify urban functions with time-series street view image}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Journal of Photogrammetry and Remote Sensing}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{198}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{153-168}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.isprsjprs.2023.03.008}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Yan Zhang and Pengyuan Liu and Filip Biljecki}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Visits to Indonesian universities</title><link>https://ual.sg/post/2023/03/16/visits-to-indonesian-universities/</link><pubDate>Thu, 16 Mar 2023 13:54:49 +0800</pubDate><guid>https://ual.sg/post/2023/03/16/visits-to-indonesian-universities/</guid><description>&lt;p&gt;The PI of the Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, represented the research group during two recent visits to Indonesia where he gave lectures and participated in a variety of activities to further our network in Indonesia and Southeast Asia:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Universitas Gadjah Mada (UGM)&lt;/li&gt;
&lt;li&gt;Institut Teknologi Bandung (ITB)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;These institutions are premier Indonesian universities in our domain.&lt;/p&gt;
&lt;p&gt;UGM, situated in Yogyakarta, is home to the &lt;a href="https://geodesi.ugm.ac.id/en/home/" target="_blank" rel="noopener"&gt;Department of Geodetic Engineering&lt;/a&gt;, and the visit was arranged by Prof &lt;a href="https://acadstaff.ugm.ac.id/triasaditya" target="_blank" rel="noopener"&gt;Trias Aditya&lt;/a&gt;, head of the department.
This visit coincided with the &lt;a href="https://geodesi.ugm.ac.id/geolandsea2023/" target="_blank" rel="noopener"&gt;South East Asia Workshop on Geodetic Data Sciences, Geoinformatics and Land Administration 2023&lt;/a&gt;, which was supported by FIG, TUM Global Incentive Fund and UGM, and co-organised by Prof &lt;a href="https://www.professoren.tum.de/en/de-vries-walter-timo" target="_blank" rel="noopener"&gt;Walter Timo de Vries&lt;/a&gt; from TU Munich.&lt;/p&gt;
&lt;p&gt;During the visit to ITB, Filip was hosted by &lt;a href="https://www.itb.ac.id/staf/profil/adiwan-fahlan-aritenang" target="_blank" rel="noopener"&gt;Adiwan Aritenang&lt;/a&gt;, programme director and his team at the &lt;a href="https://www.itb.ac.id/school-of-architecture-planning-and-policy-development" target="_blank" rel="noopener"&gt;School of Architecture, Planning, and Policy Development&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Both visits included guest lectures and discussions on collaboration, including education activities.&lt;/p&gt;
&lt;p&gt;The hospitality is very much appreciated, and we look forward to collaboration.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/16/visits-to-indonesian-universities/2_hu_c52dcb706cd2a025.webp 400w,
/post/2023/03/16/visits-to-indonesian-universities/2_hu_964cc564b2bf161f.webp 760w,
/post/2023/03/16/visits-to-indonesian-universities/2_hu_e87bcdcf456354cc.webp 1200w"
src="https://ual.sg/post/2023/03/16/visits-to-indonesian-universities/2_hu_c52dcb706cd2a025.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/16/visits-to-indonesian-universities/3_hu_b36f8ed96fc3db19.webp 400w,
/post/2023/03/16/visits-to-indonesian-universities/3_hu_98fc02097a9e3ca4.webp 760w,
/post/2023/03/16/visits-to-indonesian-universities/3_hu_56b05f20159c0a7.webp 1200w"
src="https://ual.sg/post/2023/03/16/visits-to-indonesian-universities/3_hu_b36f8ed96fc3db19.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/16/visits-to-indonesian-universities/4_hu_7e4626a3b8da3d86.webp 400w,
/post/2023/03/16/visits-to-indonesian-universities/4_hu_415adc0ef64e4461.webp 760w,
/post/2023/03/16/visits-to-indonesian-universities/4_hu_d89fc0406ad6a639.webp 1200w"
src="https://ual.sg/post/2023/03/16/visits-to-indonesian-universities/4_hu_7e4626a3b8da3d86.webp"
width="608"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/16/visits-to-indonesian-universities/5_hu_ffcdddfebd9faee9.webp 400w,
/post/2023/03/16/visits-to-indonesian-universities/5_hu_4d49780888b1840f.webp 760w,
/post/2023/03/16/visits-to-indonesian-universities/5_hu_615fe50d6f10de71.webp 1200w"
src="https://ual.sg/post/2023/03/16/visits-to-indonesian-universities/5_hu_ffcdddfebd9faee9.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/16/visits-to-indonesian-universities/6_hu_cc9c966572498148.webp 400w,
/post/2023/03/16/visits-to-indonesian-universities/6_hu_d6752df4990c7870.webp 760w,
/post/2023/03/16/visits-to-indonesian-universities/6_hu_2061be3bfc8dda61.webp 1200w"
src="https://ual.sg/post/2023/03/16/visits-to-indonesian-universities/6_hu_cc9c966572498148.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/16/visits-to-indonesian-universities/7_hu_b4bb171890203d62.webp 400w,
/post/2023/03/16/visits-to-indonesian-universities/7_hu_7bac53751ad3ec44.webp 760w,
/post/2023/03/16/visits-to-indonesian-universities/7_hu_414fcf56b5b6a6db.webp 1200w"
src="https://ual.sg/post/2023/03/16/visits-to-indonesian-universities/7_hu_b4bb171890203d62.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper: Towards Human-centric Digital Twins: Leveraging Computer Vision and Graph Models to Predict Outdoor Comfort</title><link>https://ual.sg/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/</link><pubDate>Sat, 11 Mar 2023 17:11:16 +0800</pubDate><guid>https://ual.sg/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Liu P, Zhao T, Luo J, Lei B, Frei M, Miller C, and Biljecki F (2023): Towards Human-centric Digital Twins: Leveraging Computer Vision and Graph Models to Predict Outdoor Comfort. &lt;em&gt;Sustainable Cities and Society&lt;/em&gt; 93: 104480. &lt;a href="https://doi.org/10.1016/j.scs.2023.104480" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.scs.2023.104480&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-scs-human-dt/2023-scs-human-dt.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/pengyuan-liu/"&gt;Pengyuan Liu&lt;/a&gt;.
Congratulations to him on his continued successes and great work! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;In this paper, we developed a spatio-temporally explicit GeoAI model to predict human outdoor comfort, introducing a human-centric computational model for urban sidewalks.
We conceptualised pedestrians and their interactions with the surrounding built and unbuilt environments as human-centric dynamic graphs.
Our model captures the spatio-temporal variations given by the sequential movements of human walking.&lt;/p&gt;
&lt;p&gt;The implementation has been open-sourced on &lt;a href="https://github.com/PengyuanLiu1993/GSL-sidewalk-comfort" target="_blank" rel="noopener"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Until 2023-04-30, the article is available for free via &lt;a href="https://authors.elsevier.com/a/1gjz57sfVZAEfI" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
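&lt;p&gt;A minimal numeric sketch of the idea: aggregate the pedestrian&amp;rsquo;s surroundings at each step of a walk (the spatial, GraphSAGE-like part), then fold the resulting sequence through a recurrent update (a simplified stand-in for the LSTM). All shapes and values are invented for illustration.&lt;/p&gt;

```python
import numpy as np

rng = np.random.default_rng(1)

# A walk as a sequence of 5 graph snapshots: at each step the pedestrian is
# linked to 3 nearby environment entities, each with 4 features (invented)
steps = [rng.normal(size=(3, 4)) for _ in range(5)]

# Spatial part: mean-aggregate the surroundings at every time step
spatial = np.stack([s.mean(axis=0) for s in steps])  # shape (5, 4)

# Temporal part: a bare-bones recurrent update standing in for the LSTM
W = rng.normal(size=(4, 4))
U = rng.normal(size=(4, 4))
h = np.zeros(4)
for x in spatial:
    h = np.tanh(W @ x + U @ h)

# h now summarises the whole walk and could feed a comfort classifier
```

The actual model replaces the mean with learned GraphSAGE aggregation and the tanh update with an LSTM cell, but the spatial-then-temporal structure is the same.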
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/1_hu_c993fc78c8083d53.webp 400w,
/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/1_hu_ed2f7063b252823e.webp 760w,
/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/1_hu_cc1f31e08cbe8bf.webp 1200w"
src="https://ual.sg/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/1_hu_c993fc78c8083d53.webp"
width="760"
height="276"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/2_hu_939be52f04639b60.webp 400w,
/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/2_hu_88f624b93efdf3a9.webp 760w,
/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/2_hu_b5bfbf7873813d46.webp 1200w"
src="https://ual.sg/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/2_hu_939be52f04639b60.webp"
width="760"
height="665"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Conventional sidewalk studies focused on quantitative analysis of sidewalk walkability at a large scale which cannot capture the dynamic interactions between the environment and individual factors. Embracing the idea of Tech for Social Good, Urban Digital Twins seek AI-empowered approaches to bridge humans with digitally-mediated technologies to enhance their prediction ability. We employ GraphSAGE-LSTM, a geo-spatial artificial intelligence (GeoAI) framework on crowdsourced data and computer vision to predict human comfort on the sidewalks. Conceptualising the pedestrians and their interactions with surrounding built and unbuilt environments as human-centric dynamic graphs, our model captures such spatio-temporal variations given by the sequential movements of human walking, enabling the GraphSAGE-LSTM to be spatio-temporal-explicit. Our experiments suggest that the proposed model provides higher accuracy by more than 20% than a traditional machine learning model and two state-of-art deep learning frameworks, thus, enhancing the prediction power of Urban Digital Twin. The source code for the model is shared openly on GitHub.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-scs-human-dt/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-scs-human-dt/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/page-one_hu_d3534d11def1b099.webp 400w,
/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/page-one_hu_3a781b5c77cd5ff1.webp 760w,
/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/page-one_hu_ba1636d3528798b0.webp 1200w"
src="https://ual.sg/post/2023/03/11/new-paper-towards-human-centric-digital-twins-leveraging-computer-vision-and-graph-models-to-predict-outdoor-comfort/page-one_hu_d3534d11def1b099.webp"
width="589"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_scs_human_dt&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Liu, Pengyuan and Zhao, Tianhong and Luo, Junjie and Lei, Binyu and Frei, Mario and Miller, Clayton and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.scs.2023.104480}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Sustainable Cities and Society}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104480}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Towards Human-centric Digital Twins: Leveraging Computer Vision and Graph Models to Predict Outdoor Comfort}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{93}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Talk by Dr Yuhao Lu from the Singapore-ETH Centre</title><link>https://ual.sg/post/2023/03/08/talk-by-dr-yuhao-lu-from-the-singapore-eth-centre/</link><pubDate>Wed, 08 Mar 2023 11:09:19 +0800</pubDate><guid>https://ual.sg/post/2023/03/08/talk-by-dr-yuhao-lu-from-the-singapore-eth-centre/</guid><description>&lt;p&gt;Our Lab and department hosted Dr &lt;a href="https://fcl.ethz.ch/people/researchers/Yuhao-Lu.html" target="_blank" rel="noopener"&gt;Yuhao Lu&lt;/a&gt;, researcher and module coordinator at the &lt;a href="https://fcl.ethz.ch" target="_blank" rel="noopener"&gt;Future Cities Laboratory&lt;/a&gt;,
&lt;a href="https://sec.ethz.ch" target="_blank" rel="noopener"&gt;Singapore-ETH Centre&lt;/a&gt;, in which our Lab is involved through a &lt;a href="https://fcl.ethz.ch/research/integration-and-strategies/semantic-urban-elements.html" target="_blank" rel="noopener"&gt;new project&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We had the pleasure of having Yuhao in our &lt;a href="https://ual.sg/seminars"&gt;seminar series&lt;/a&gt; to give a talk titled &amp;lsquo;Map Outside the &lt;del&gt;Box&lt;/del&gt; Pixel&amp;rsquo;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://scholar.google.com/citations?user=VbGGSfUAAAAJ&amp;amp;hl=en&amp;amp;oi=ao" target="_blank" rel="noopener"&gt;Yuhao&amp;rsquo;s research&lt;/a&gt; has been published in journals such as ISPRS Journal of Photogrammetry and Remote Sensing and Landscape and Urban Planning.&lt;/p&gt;
&lt;p&gt;Thanks, and looking forward to future collaborations!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/08/talk-by-dr-yuhao-lu-from-the-singapore-eth-centre/1_hu_bdf21ed05261db90.webp 400w,
/post/2023/03/08/talk-by-dr-yuhao-lu-from-the-singapore-eth-centre/1_hu_52cfb1e1a823b58d.webp 760w,
/post/2023/03/08/talk-by-dr-yuhao-lu-from-the-singapore-eth-centre/1_hu_abbc3f30ec3afe93.webp 1200w"
src="https://ual.sg/post/2023/03/08/talk-by-dr-yuhao-lu-from-the-singapore-eth-centre/1_hu_bdf21ed05261db90.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/03/08/talk-by-dr-yuhao-lu-from-the-singapore-eth-centre/poster_hu_22fc0b4fd9ac5d29.webp 400w,
/post/2023/03/08/talk-by-dr-yuhao-lu-from-the-singapore-eth-centre/poster_hu_759532177e576475.webp 760w,
/post/2023/03/08/talk-by-dr-yuhao-lu-from-the-singapore-eth-centre/poster_hu_3b2550787f016eac.webp 1200w"
src="https://ual.sg/post/2023/03/08/talk-by-dr-yuhao-lu-from-the-singapore-eth-centre/poster_hu_22fc0b4fd9ac5d29.webp"
width="537"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract-of-the-lecture"&gt;Abstract of the lecture&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Pixels (picture elements) are the smallest unit of raster-based maps.
Remote sensing and some geographic information analyses rely on pixels to store information, simulate changes, and communicate empirical findings. Pixels are a fundamental cartographic unit, essential to many, if not all, mapping exercises. However, pixels can also pose constraints on our research and creative work, for example, due to their sizes (resolution), or the way they have been arranged and/or analysed. In this presentation, titled &amp;ldquo;Map Outside the &lt;del&gt;box&lt;/del&gt; Pixel&amp;rdquo;, I will share two projects and a few maps that used pixels rather creatively, unlocking new ways to conduct pixel-based geospatial research and creative design projects. These projects are technical by nature, but also creative in ways that I hope can speak to individuals who are not trained in geospatial sciences.&lt;/p&gt;
&lt;/blockquote&gt;</description></item><item><title>Visit by Dr Yujin Park from Chung-Ang University</title><link>https://ual.sg/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/</link><pubDate>Tue, 28 Feb 2023 19:39:19 +0800</pubDate><guid>https://ual.sg/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/</guid><description>&lt;p&gt;Our Lab and department hosted Dr &lt;a href="http://planning.cau.ac.kr/01_info/sub02_view.php?gubun=1&amp;amp;seq=600" target="_blank" rel="noopener"&gt;Park Yujin&lt;/a&gt;, assistant professor at the &lt;a href="http://planning.cau.ac.kr" target="_blank" rel="noopener"&gt;Department of Urban Planning and Real Estate&lt;/a&gt;,
at the &lt;a href="https://www.cau.ac.kr/" target="_blank" rel="noopener"&gt;Chung-Ang University&lt;/a&gt; (Seoul, South Korea).&lt;/p&gt;
&lt;p&gt;Yujin&amp;rsquo;s research focuses on Environmental Planning, Green Infrastructure, 3D City Modeling &amp;amp; Big Data, and Urban Sustainability.
Her research has been published in journals such as Computers, Environment and Urban Systems and Landscape and Urban Planning.&lt;/p&gt;
&lt;p&gt;During her stay, she delivered the lecture &lt;em&gt;Urban shade planning for thermally sustainable cities: perspectives from multi-sensor analysis&lt;/em&gt; (poster and abstract below).&lt;/p&gt;
&lt;p&gt;Thanks, and looking forward to future collaborations!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/1_hu_4d4d6f29815770c1.webp 400w,
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/1_hu_faa69aff6f1f0baa.webp 760w,
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/1_hu_d31372aa971575a1.webp 1200w"
src="https://ual.sg/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/1_hu_4d4d6f29815770c1.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/2_hu_13040560b911a0b6.webp 400w,
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/2_hu_806979f8d2320c9f.webp 760w,
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/2_hu_2904cfde5e268b34.webp 1200w"
src="https://ual.sg/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/2_hu_13040560b911a0b6.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/3_hu_c68a97fd6fdf40e6.webp 400w,
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/3_hu_43dc6494c6763e25.webp 760w,
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/3_hu_a39c2bb23dda4e17.webp 1200w"
src="https://ual.sg/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/3_hu_c68a97fd6fdf40e6.webp"
width="760"
height="573"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/poster_hu_c4b0d388f79b6b38.webp 400w,
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/poster_hu_5f641e9b298b141b.webp 760w,
/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/poster_hu_6b5f84f28a8d5af7.webp 1200w"
src="https://ual.sg/post/2023/02/28/visit-by-dr-yujin-park-from-chung-ang-university/poster_hu_c4b0d388f79b6b38.webp"
width="537"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract-of-the-lecture"&gt;Abstract of the lecture&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Devising nature- and design-based solutions to combat climate challenges is an important issue in urban planning for advancing urban sustainability. Recent technological advances and the proliferation of big geospatial data allow for a more sophisticated digital representation of urban geometry in 3D. One underexplored but critical aspect is the impact of vertical urban features and their shading. Shading objects, including trees and buildings, are omnipresent in cities and can be utilized by urban planners to create effective local and regional plans for urban cooling. This talk addresses the interface of 3D land-use design and heat mitigation via diurnal shading, integrating geospatial analytics (3D GIS, spatial simulation and statistics) and remotely-sensed multi-resolution thermal data as a methodological tool. Based on a fine-resolution 3D model derived mainly from LiDAR, the relationships among land surface temperature, land cover composition, and shade characteristics are analyzed using statistical inference. The talk proposes challenges and research agendas that call for collaboration across domains (e.g. GIScience, energy, economics) to create sustainable land-use designs.&lt;/p&gt;
&lt;/blockquote&gt;</description></item><item><title>CDE Awards and Recognition 2023</title><link>https://ual.sg/post/2023/02/27/cde-awards-and-recognition-2023/</link><pubDate>Mon, 27 Feb 2023 07:16:19 +0800</pubDate><guid>https://ual.sg/post/2023/02/27/cde-awards-and-recognition-2023/</guid><description>&lt;p&gt;Last week, faculty members from the &lt;a href="https://cde.nus.edu.sg" target="_blank" rel="noopener"&gt;NUS College of Design and Engineering&lt;/a&gt; have been recognised for excellence in teaching and research at the CDE Awards and Recognition Ceremony for 2023.&lt;/p&gt;
&lt;p&gt;The PI of the Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; was awarded two honours:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Young Researcher Award 2023&lt;/li&gt;
&lt;li&gt;Teaching Excellence Award AY 2021/22&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;em&gt;(One of the few to be awarded for both research and teaching, affirming the excellence of our Lab.)&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Check out the &lt;a href="https://cde.nus.edu.sg/cde-awards-and-recognition-2023/" target="_blank" rel="noopener"&gt;full press release of the College&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Congratulations to all colleagues who have been recognised! &amp;#x1f44f;&lt;/p&gt;
&lt;figure id="figure-left-to-right-assoc-prof-simone-fatichi-cde-deputy-dean-prof-teo-kie-leong-asst-prof-filip-biljecki"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Left to right: Assoc Prof Simone Fatichi, CDE Deputy Dean Prof Teo Kie Leong, Asst Prof Filip Biljecki" srcset="
/post/2023/02/27/cde-awards-and-recognition-2023/young-researcher_hu_856c3a058cafcc57.webp 400w,
/post/2023/02/27/cde-awards-and-recognition-2023/young-researcher_hu_f3c3721a642499c5.webp 760w,
/post/2023/02/27/cde-awards-and-recognition-2023/young-researcher_hu_acefedc8d0075d56.webp 1200w"
src="https://ual.sg/post/2023/02/27/cde-awards-and-recognition-2023/young-researcher_hu_856c3a058cafcc57.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Left to right: Assoc Prof Simone Fatichi, CDE Deputy Dean Prof Teo Kie Leong, Asst Prof Filip Biljecki
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;figure id="figure-teaching-excellence-award-ay-202122"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Teaching Excellence Award AY 2021/22" srcset="
/post/2023/02/27/cde-awards-and-recognition-2023/teaching-excellence-1_hu_48e2cf9e923f5e95.webp 400w,
/post/2023/02/27/cde-awards-and-recognition-2023/teaching-excellence-1_hu_58b0813465ec201d.webp 760w,
/post/2023/02/27/cde-awards-and-recognition-2023/teaching-excellence-1_hu_a3d003c087e3b211.webp 1200w"
src="https://ual.sg/post/2023/02/27/cde-awards-and-recognition-2023/teaching-excellence-1_hu_48e2cf9e923f5e95.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Teaching Excellence Award AY 2021/22
&lt;/figcaption&gt;&lt;/figure&gt;</description></item><item><title>Congratulations to Dr Pengyuan Liu on a faculty position!</title><link>https://ual.sg/post/2023/02/26/congratulations-to-dr-pengyuan-liu-on-a-faculty-position/</link><pubDate>Sun, 26 Feb 2023 10:39:19 +0800</pubDate><guid>https://ual.sg/post/2023/02/26/congratulations-to-dr-pengyuan-liu-on-a-faculty-position/</guid><description>&lt;p&gt;Dr &lt;a href="https://ual.sg/author/pengyuan-liu/"&gt;Pengyuan Liu&lt;/a&gt;, who has been a Postdoctoral Research Fellow in our Lab, is now an Assistant Professor at the Nanjing University of Information Science &amp;amp; Technology.&lt;/p&gt;
&lt;p&gt;Pengyuan joined our &lt;a href="https://ual.sg/"&gt;NUS Urban Analytics Lab&lt;/a&gt; in early 2022 to work on a project on digital twins, after his PhD at the University of Leicester and research at the University of Helsinki.
In our group, he worked on several papers (check out his publications &lt;a href="https://scholar.google.com/citations?hl=en&amp;amp;user=XZXvFD0AAAAJ" target="_blank" rel="noopener"&gt;here&lt;/a&gt;) and mentored junior researchers.&lt;/p&gt;
&lt;p&gt;Pengyuan is the first member of our young research group to assume a faculty position, and it goes without saying that we are proud to have him as our Lab&amp;rsquo;s alumnus.
We thank him for the collaboration so far and we wish him all the best in his endeavours.
We are sure that he will have a great future in academia.&lt;/p&gt;
&lt;p&gt;Congratulations! &amp;#x1f44f;&lt;/p&gt;</description></item><item><title>Visit by Dr Jonathan Natanian, Head of EPDL</title><link>https://ual.sg/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/</link><pubDate>Fri, 03 Feb 2023 10:39:19 +0800</pubDate><guid>https://ual.sg/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/</guid><description>&lt;p&gt;Our Lab and department hosted Dr &lt;a href="https://jonathann.net.technion.ac.il" target="_blank" rel="noopener"&gt;Jonathan Natanian&lt;/a&gt;, assistant professor at the &lt;a href="https://architecture.technion.ac.il" target="_blank" rel="noopener"&gt;Faculty of Architecture and Town Planning&lt;/a&gt; at &lt;a href="http://www.technion.ac.il/en/" target="_blank" rel="noopener"&gt;Technion &amp;ndash; Israel Institute of Technology&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;At Technion, Jonathan established the &lt;a href="https://epdl.net.technion.ac.il" target="_blank" rel="noopener"&gt;EPDL - Environmental Performance and Design Lab&lt;/a&gt;, a research group that bridges the gap between architecture and environmental engineering by combining three main clusters which together form a triangle of knowledge: data acquisition, computational analysis, and environmental design interface.
EPDL pursues this bridge in a cross-contextual, multi-scale, and cross-disciplinary way.&lt;/p&gt;
&lt;p&gt;During his stay, he delivered the lecture &lt;em&gt;The Eco-Race in Architecture Caught Between Environmental Intuition and Intelligence&lt;/em&gt; (poster and abstract below).&lt;/p&gt;
&lt;p&gt;Thanks, and looking forward to future collaborations!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/2_hu_7e322fa599ce087a.webp 400w,
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/2_hu_f5e6b073079181ae.webp 760w,
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/2_hu_c6067c729eb478ee.webp 1200w"
src="https://ual.sg/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/2_hu_7e322fa599ce087a.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/3_hu_5a0b02e00cda08d7.webp 400w,
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/3_hu_8098cd730bd6105c.webp 760w,
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/3_hu_f7234d5b07fc87a0.webp 1200w"
src="https://ual.sg/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/3_hu_5a0b02e00cda08d7.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/4_hu_2298a88321274119.webp 400w,
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/4_hu_f317a8d51137b0c7.webp 760w,
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/4_hu_66d09aa86e9c8635.webp 1200w"
src="https://ual.sg/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/4_hu_2298a88321274119.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/poster_hu_c8c10335137d63af.webp 400w,
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/poster_hu_de340eb234fbdbad.webp 760w,
/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/poster_hu_48cf12e9f49b9647.webp 1200w"
src="https://ual.sg/post/2023/02/03/visit-by-dr-jonathan-natanian-head-of-epdl/poster_hu_c8c10335137d63af.webp"
width="537"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract-of-the-lecture"&gt;Abstract of the lecture&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;The race towards sustainability in the built
environment is on!&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Zero carbon by 2050 and limiting global warming to 1.5 °C are just some of the ambitious thresholds we need to reach - but how do we get there by design?&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Many claim that we should get back to the environmental architectural intuition we once had - but will it be enough? Others push toward fully-digitized environmental design workflows - but what will happen then?&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;This lecture will discuss some of these questions
which revolve around the gaps between
environmental intuition and intelligence in
Architecture from personal, local, and global
eco-race perspectives.&lt;/p&gt;
&lt;/blockquote&gt;</description></item><item><title>New paper: Challenges of Urban Digital Twins</title><link>https://ual.sg/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/</link><pubDate>Thu, 05 Jan 2023 14:35:16 +0800</pubDate><guid>https://ual.sg/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Lei B, Janssen P, Stoter J, Biljecki F (2023): Challenges of Urban Digital Twins: A Systematic Review and a Delphi Expert Survey. &lt;em&gt;Automation in Construction&lt;/em&gt; 147: 104716. &lt;a href="https://doi.org/10.1016/j.autcon.2022.104716" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.autcon.2022.104716&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-autcon-dt-challenges/2023-autcon-dt-challenges.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt;.
Congratulations on her continued successes and great work! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;This publication is the latest one from her line of research on digital twins, following her recently published work on &lt;a href="https://ual.sg/publication/2022-ijgis-3-d-city-index/"&gt;assessing and benchmarking 3D city models&lt;/a&gt; (in &lt;a href="https://doi.org/10.1080/13658816.2022.2140808" target="_blank" rel="noopener"&gt;IJGIS&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;The paper presents a comprehensive overview of challenges to the adoption of digital twins at the urban scale.
The review combined two methods: a systematic literature review (primarily covering academia) and an expert survey (mainly focused on government and industry stakeholders) conducted according to the stringent Delphi method, involving dozens of experts around the world.
About two dozen challenges, both technical and non-technical, have been identified, and a consensus on their severity has been reached.&lt;/p&gt;
&lt;p&gt;The research was conducted in collaboration with the &lt;a href="https://3d.bk.tudelft.nl" target="_blank" rel="noopener"&gt;3D Geoinformation group at TU Delft&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/1_hu_4c9daed00b3c8187.webp 400w,
/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/1_hu_f9c4e4da1f15883d.webp 760w,
/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/1_hu_a16b3bf1771cae95.webp 1200w"
src="https://ual.sg/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/1_hu_4c9daed00b3c8187.webp"
width="685"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/2_hu_9ee8f39d63f3096b.webp 400w,
/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/2_hu_b5bd2b1f74686353.webp 760w,
/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/2_hu_195e4818c534343c.webp 1200w"
src="https://ual.sg/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/2_hu_9ee8f39d63f3096b.webp"
width="760"
height="623"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Digital twins have gained growing popularity in the urban and geospatial context.&lt;/li&gt;
&lt;li&gt;The adoption of digital twins is hindered by a variety of challenges.&lt;/li&gt;
&lt;li&gt;Identification and elaboration of challenges with a dual and rigorous approach.&lt;/li&gt;
&lt;li&gt;Structured list of 23 challenges to the operation of digital twins.&lt;/li&gt;
&lt;li&gt;Consensus among academic, industry, and government parties.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;Many challenges to operate digital twins remain, hindering their design and implementation, and are rarely discussed. Furthermore, issues of social and legal nature are often overlooked. We identify the challenges of operating digital twins in the urban context through a bifurcated and multi-dimensional approach: a systematic literature review and an expert survey. The review organises the identified challenges across technical and non-technical dimensions. As the topic is novel, the corpus is rather small and lacking the contextualisation of challenges. Thus, we complement it with a survey based on the Delphi method, involving a diverse panel of domain experts covering academia, industry and government organisations. Combining the results, we identify 14 technical and 9 non-technical challenges and map them to phases of the digital twin’s life cycle. The most severe challenges appear to be related to interoperability (e.g. disparate semantic standards) and practical value (e.g. lack of business models).&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-autcon-dt-challenges/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/page-one_hu_c3c878d3d50693dc.webp 400w,
/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/page-one_hu_a886b36014cf6d05.webp 760w,
/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/page-one_hu_7678be73fea12bdd.webp 1200w"
src="https://ual.sg/post/2023/01/05/new-paper-challenges-of-urban-digital-twins/page-one_hu_c3c878d3d50693dc.webp"
width="569"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_autcon_dt_challenges&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Challenges of Urban Digital Twins: A Systematic Review and a Delphi Expert Survey}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Lei, Binyu and Janssen, Patrick and Stoter, Jantien and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Automation in Construction}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.autcon.2022.104716}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104716}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{147}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Our research featured in a CNA documentary in the skies</title><link>https://ual.sg/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/</link><pubDate>Tue, 03 Jan 2023 14:08:19 +0800</pubDate><guid>https://ual.sg/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/</guid><description>&lt;p&gt;Spot us high in the sky! &amp;#x2708;&amp;#xfe0f;&lt;/p&gt;
&lt;p&gt;If you fly on Singapore Airlines, you may see us being featured in the CNA Documentary Series Innovating For The Future. &amp;#x1f60a;&lt;/p&gt;
&lt;p&gt;One of the episodes includes a variety of research done by &lt;a href="https://ual.sg/about/"&gt;us and our sister labs at NUS&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The documentary, which was developed in collaboration with NUS, looks at how diverse talents in the university community – each a leader and luminary in their own area of expertise – are catalysing positive change in Singapore and beyond.&lt;/p&gt;
&lt;p&gt;Catch this 10-part series – delving into topics like ageing, finance and food – to find out how NUS faculty, students and alumni are jointly creating a better world for the future.&lt;/p&gt;
&lt;p&gt;Photo credit: &lt;a href="https://ual.sg/author/mario-frei/"&gt;Mario Frei&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/1_hu_ae71dcf5678ef3f.webp 400w,
/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/1_hu_ec456790018f339c.webp 760w,
/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/1_hu_1f2de451d7247669.webp 1200w"
src="https://ual.sg/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/1_hu_ae71dcf5678ef3f.webp"
width="760"
height="573"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/2_hu_953bcec2da2d92d0.webp 400w,
/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/2_hu_6baba5ad3545945a.webp 760w,
/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/2_hu_82955209c7044953.webp 1200w"
src="https://ual.sg/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/2_hu_953bcec2da2d92d0.webp"
width="760"
height="573"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/3_hu_1125b19e22da89c8.webp 400w,
/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/3_hu_1119ea0935f3a53c.webp 760w,
/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/3_hu_9adb9e4d1c6510e.webp 1200w"
src="https://ual.sg/post/2023/01/03/our-research-featured-in-a-cna-documentary-in-the-skies/3_hu_1125b19e22da89c8.webp"
width="760"
height="573"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Visit to Korea</title><link>https://ual.sg/post/2022/12/13/visit-to-korea/</link><pubDate>Tue, 13 Dec 2022 14:34:49 +0800</pubDate><guid>https://ual.sg/post/2022/12/13/visit-to-korea/</guid><description>&lt;p&gt;The PI of the Lab, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, represented the research group during a recent visit to Korea and participated in the following activities:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Keynote at the &lt;a href="https://seoulbigdataforum.kr/" target="_blank" rel="noopener"&gt;Seoul Big Data Forum 2022&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Lecture at the Seoul Metropolitan Government / Spatial Information Policy Division&lt;/li&gt;
&lt;li&gt;Lecture at Chung-Ang University / &lt;a href="http://planning.cau.ac.kr/eng/" target="_blank" rel="noopener"&gt;Department of Urban Planning and Real Estate&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Lecture at Yonsei University / &lt;a href="https://civil.yonsei.ac.kr/civil_en/index.do" target="_blank" rel="noopener"&gt;Department of Civil and Environmental Engineering&lt;/a&gt; &amp;amp; &lt;a href="http://scsi.yonsei.ac.kr/" target="_blank" rel="noopener"&gt;Spatial Computing for Sustainable Infrastructure Lab&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Lecture at Seoul National University / &lt;a href="https://gses.snu.ac.kr/en" target="_blank" rel="noopener"&gt;Graduate School of Environmental Studies&lt;/a&gt; &amp;amp; &lt;a href="http://cityenergylab.cafe24.com/" target="_blank" rel="noopener"&gt;City Energy Lab&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Visit to &lt;a href="https://gaia3d.com/en/" target="_blank" rel="noopener"&gt;Gaia3D&lt;/a&gt;, a leading Korean company on digital twins &amp;amp; 3D modelling&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/12/13/visit-to-korea/1_hu_b23b4db11bc29e60.webp 400w,
/post/2022/12/13/visit-to-korea/1_hu_90712fc81248aa2e.webp 760w,
/post/2022/12/13/visit-to-korea/1_hu_479d1280aee0a142.webp 1200w"
src="https://ual.sg/post/2022/12/13/visit-to-korea/1_hu_b23b4db11bc29e60.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/12/13/visit-to-korea/2_hu_3a2570a5fbf9c797.webp 400w,
/post/2022/12/13/visit-to-korea/2_hu_dbbda61c7c8a8881.webp 760w,
/post/2022/12/13/visit-to-korea/2_hu_2e58911149c6518b.webp 1200w"
src="https://ual.sg/post/2022/12/13/visit-to-korea/2_hu_3a2570a5fbf9c797.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/12/13/visit-to-korea/3_hu_dd62ab2b9fa63eb0.webp 400w,
/post/2022/12/13/visit-to-korea/3_hu_94d4e8b55e8ea4df.webp 760w,
/post/2022/12/13/visit-to-korea/3_hu_3b36dd76a7e48193.webp 1200w"
src="https://ual.sg/post/2022/12/13/visit-to-korea/3_hu_dd62ab2b9fa63eb0.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper: InstantCITY</title><link>https://ual.sg/post/2022/12/04/new-paper-instantcity/</link><pubDate>Sun, 04 Dec 2022 13:11:16 +0800</pubDate><guid>https://ual.sg/post/2022/12/04/new-paper-instantcity/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Wu AN, Biljecki F (2023): InstantCITY: Synthesising morphologically accurate geospatial data for urban form analysis, transfer, and quality control. &lt;em&gt;ISPRS Journal of Photogrammetry and Remote Sensing&lt;/em&gt; 195: 90-104. &lt;a href="https://doi.org/10.1016/j.isprsjprs.2022.11.005" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.isprsjprs.2022.11.005&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-ijprs-instantcity/2023-ijprs-instantcity.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/abraham-noah-wu/"&gt;Abraham Noah Wu&lt;/a&gt;.
Congratulations on his continued successes and great work! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;This work considerably builds on our previous work on &lt;a href="https://ual.sg/publication/2022-ijgis-ganmapper/"&gt;GANmapper &amp;ndash; Geographical Data Translation&lt;/a&gt; (published in &lt;a href="https://doi.org/10.1080/13658816.2022.2041643" target="_blank" rel="noopener"&gt;IJGIS&lt;/a&gt;).
We present an intrinsic mapping method and demonstrate that it is possible to map one urban feature from another: in this case, we generate building footprints solely from road networks.&lt;/p&gt;
&lt;p&gt;The work has several use cases, for example:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Generating geospatial data where no data is available or completeness is poor, but where good data on another &lt;em&gt;correlated&lt;/em&gt; phenomenon is available (e.g. roads to buildings).&lt;/li&gt;
&lt;li&gt;A new intrinsic method for spatial data quality control. We present a case with OpenStreetMap: we use road networks to predict how many buildings should be in an area, compare the prediction with the content of OSM, and flag areas with a substantial mismatch, which indicates poor completeness.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The implementation has been released as open-source software at &lt;a href="https://github.com/ualsg/InstantCity" target="_blank" rel="noopener"&gt;our Github&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Until 2023-01-18, the article is available for free via &lt;a href="https://authors.elsevier.com/a/1gACQ3I9x1j9Zf" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/12/04/new-paper-instantcity/1_hu_e62847e977cea9c1.webp 400w,
/post/2022/12/04/new-paper-instantcity/1_hu_50dd8a79ac250c39.webp 760w,
/post/2022/12/04/new-paper-instantcity/1_hu_da7c7d3f37a88b2f.webp 1200w"
src="https://ual.sg/post/2022/12/04/new-paper-instantcity/1_hu_e62847e977cea9c1.webp"
width="760"
height="293"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/12/04/new-paper-instantcity/2_hu_d75443f8524c4a27.webp 400w,
/post/2022/12/04/new-paper-instantcity/2_hu_96acfcfa9ec3b20e.webp 760w,
/post/2022/12/04/new-paper-instantcity/2_hu_703f085566aaacde.webp 1200w"
src="https://ual.sg/post/2022/12/04/new-paper-instantcity/2_hu_d75443f8524c4a27.webp"
width="760"
height="594"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/12/04/new-paper-instantcity/3_hu_ca2bcc26a1e54bf5.webp 400w,
/post/2022/12/04/new-paper-instantcity/3_hu_a0e96be4eeb915f8.webp 760w,
/post/2022/12/04/new-paper-instantcity/3_hu_c3a18f01c82a2378.webp 1200w"
src="https://ual.sg/post/2022/12/04/new-paper-instantcity/3_hu_ca2bcc26a1e54bf5.webp"
width="728"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Generative Adversarial Network (GAN) is widely used in many generative problems, including in spatial information sciences and urban systems. The data generated by GANs can achieve high quality to augment downstream training or to complete missing entries in a dataset. GANs can also be used to learn the relationship between two datasets and translate one into another, e.g. road network data into building footprint data. However, such approach has not been developed in the geospatial and urban data science context, its usability remains unknown, and the methods are not fully developed. We develop a new Geographical Data Translation algorithm based on GAN to generate high-resolution vector building data solely from street networks, which may be used to predict the urban morphology in absence of building data, also enabling studies in unmapped or undermapped urban geographies, among other advantages. Experiments on 16 cities around the world demonstrate that the generated datasets are largely successful in resembling ground truth morphologies. Thus, the approach may be used in lieu of traditional data for tasks that are often hampered by lack of data, e.g. urban form studies, simulation of urban morphologies in new contexts, and spatial data quality assessment. Our work proposes a novel rapid approach to generate building footprints in replacement of procedural methods and it introduces a new intrinsic method for large-scale spatial data quality control, which we test on OpenStreetMap by predicting missing buildings and suggesting the completeness of data without the usually required authoritative counterparts. The code, sample model, and dataset are available openly.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-ijprs-instantcity/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-ijprs-instantcity/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/12/04/new-paper-instantcity/page-one_hu_47e044ced675fb4d.webp 400w,
/post/2022/12/04/new-paper-instantcity/page-one_hu_28fd5a475c3f93f4.webp 760w,
/post/2022/12/04/new-paper-instantcity/page-one_hu_475775927f839bcb.webp 1200w"
src="https://ual.sg/post/2022/12/04/new-paper-instantcity/page-one_hu_47e044ced675fb4d.webp"
width="569"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_ijprs_instantcity&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Wu, Abraham Noah and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.isprsjprs.2022.11.005}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Journal of Photogrammetry and Remote Sensing}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{90-104}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{InstantCITY: Synthesising morphologically accurate geospatial data for urban form analysis, transfer, and quality control}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{195}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>The PI of the Lab is on the list of top global 2% scientists</title><link>https://ual.sg/post/2022/11/25/the-pi-of-the-lab-is-on-the-list-of-top-global-2-scientists/</link><pubDate>Fri, 25 Nov 2022 10:52:16 +0800</pubDate><guid>https://ual.sg/post/2022/11/25/the-pi-of-the-lab-is-on-the-list-of-top-global-2-scientists/</guid><description>&lt;p&gt;The PI of the &lt;a href="https://ual.sg/"&gt;NUS Urban Analytics Lab&lt;/a&gt;, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, was included on the Stanford list of global top 2% of scientists.
The NUS Business School covered this feature, so we relay their &lt;a href="https://bizbeat.nus.edu.sg/community-news/article/accolade-for-biz-faculty-listed-among-global-top-2-of-scientists/" target="_blank" rel="noopener"&gt;post&lt;/a&gt; below.&lt;/p&gt;
&lt;h3 id="accolade-for-biz-faculty-listed-among-global-top-2-of-scientists"&gt;Accolade for BIZ Faculty Listed among Global Top 2% of Scientists&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;NUS Business School’s professors have been recognised for their research impact—16 were named among the global top two per cent of scientists in a Stanford &lt;a href="https://elsevier.digitalcommonsdata.com/datasets/btchxktzyw/5" target="_blank" rel="noopener"&gt;study&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure id="figure-top-row-from-left-prof-thompson-teo-prof-remus-ilies-prof-andrew-k-rose-dean-prof-duan-jin-chuan-visiting-prof-michael-frese-second-row-prof-david-de-cremer-prof-wong-poh-kam-prof-vivien-lim-adjunct-prof-richard-d-arvey-prof-jochen-wirtz-prof-ho-teck-hua-provost-third-row-prof-chang-sea-jin-prof-andrew-delios-prof-sumit-agarwal-prof-mark-goh-asst-prof-filip-biljecki"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Top row from left: Prof Thompson Teo; Prof Remus Ilies; Prof Andrew K. Rose (Dean); Prof Duan Jin Chuan; Visiting Prof Michael Frese. Second row: Prof David de Cremer; Prof Wong Poh Kam; Prof Vivien Lim; Adjunct Prof Richard D. Arvey; Prof Jochen Wirtz; Prof Ho Teck Hua (Provost). Third row: Prof Chang Sea Jin; Prof Andrew Delios; Prof Sumit Agarwal; Prof Mark Goh; Asst Prof Filip Biljecki." srcset="
/post/2022/11/25/the-pi-of-the-lab-is-on-the-list-of-top-global-2-scientists/featured_hu_172d0aa2deff5d3e.webp 400w,
/post/2022/11/25/the-pi-of-the-lab-is-on-the-list-of-top-global-2-scientists/featured_hu_5de2b49b5c9fd84f.webp 760w,
/post/2022/11/25/the-pi-of-the-lab-is-on-the-list-of-top-global-2-scientists/featured_hu_a30f10d81d4ec5d6.webp 1200w"
src="https://ual.sg/post/2022/11/25/the-pi-of-the-lab-is-on-the-list-of-top-global-2-scientists/featured_hu_172d0aa2deff5d3e.webp"
width="760"
height="369"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Top row from left: Prof Thompson Teo; Prof Remus Ilies; Prof Andrew K. Rose (Dean); Prof Duan Jin Chuan; Visiting Prof Michael Frese. Second row: Prof David de Cremer; Prof Wong Poh Kam; Prof Vivien Lim; Adjunct Prof Richard D. Arvey; Prof Jochen Wirtz; Prof Ho Teck Hua (Provost). Third row: Prof Chang Sea Jin; Prof Andrew Delios; Prof Sumit Agarwal; Prof Mark Goh; Asst Prof Filip Biljecki.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The scientists were organised into 22 scientific fields and 174 sub-fields, with separate data shown for career-long and single-year (2021) impact. The top-cited scientists were selected based on standardised citation indicators and co-authorship of papers, among other criteria.
Our heartiest congratulations to the professors!&lt;/p&gt;
&lt;/blockquote&gt;</description></item><item><title>New paper: Mining real estate ads and property transactions for building and amenity data acquisition</title><link>https://ual.sg/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/</link><pubDate>Thu, 24 Nov 2022 09:21:16 +0800</pubDate><guid>https://ual.sg/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Chen X, Biljecki F (2022): Mining real estate ads and property transactions for building and amenity data acquisition. &lt;em&gt;Urban Informatics&lt;/em&gt; 1: 12. &lt;a href="https://doi.org/10.1007/s44212-022-00012-2" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1007/s44212-022-00012-2&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-ui-real-estate-mining/2022-ui-real-estate-mining.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/xinyu-chen/"&gt;Xinyu Chen&lt;/a&gt;.
Congratulations on her first journal paper! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/1_hu_a8e0fdb97dba063c.webp 400w,
/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/1_hu_2a80f949ec7fc970.webp 760w,
/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/1_hu_63943587a6606e35.webp 1200w"
src="https://ual.sg/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/1_hu_a8e0fdb97dba063c.webp"
width="760"
height="478"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/2_hu_e87eee22b48ef942.webp 400w,
/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/2_hu_a212ee2671720fba.webp 760w,
/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/2_hu_c96374810488720b.webp 1200w"
src="https://ual.sg/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/2_hu_e87eee22b48ef942.webp"
width="760"
height="414"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The journal is founded and managed by the &lt;a href="http://isocui.org/" target="_blank" rel="noopener"&gt;International Society for Urban Informatics&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Acquiring spatial data of fine and dynamic urban features such as buildings remains challenging. This paper brings attention to real estate advertisements and property sales data as valuable and dynamic sources of geoinformation in the built environment, but unutilised in spatial data infrastructures. Given the wealth of information they hold and their user-generated nature, we put forward the idea of real estate data as an instance of implicit volunteered geographic information and bring attention to their spatial aspect, potentially alleviating the challenge of acquiring spatial data of fine and dynamic urban features. We develop a mechanism of facilitating continuous acquisition, maintenance, and quality assurance of building data and associated amenities from real estate data. The results of the experiments conducted in Singapore reveal that one month of property listings provides information on 7% of the national building stock and about half of the residential subset, e.g. age, type, and storeys, which are often not available in sources such as OpenStreetMap, potentially supporting applications such as 3D city modelling and energy simulations. The method may serve as a novel means to spatial data quality control as it detects missing amenities and maps future buildings, which are advertised and transacted before they are built, but it exhibits mixed results in identifying unmapped buildings as ads may contain errors that impede the idea.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-ui-real-estate-mining/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-ui-real-estate-mining/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/page-one_hu_fb03998e23eb9217.webp 400w,
/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/page-one_hu_f2600c4f69282b2a.webp 760w,
/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/page-one_hu_5864867d85c794bf.webp 1200w"
src="https://ual.sg/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/page-one_hu_fb03998e23eb9217.webp"
width="544"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_ui_real_estate_mining&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Mining real estate ads and property transactions for building and amenity data acquisition}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Chen, Xinyu and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Urban Informatics}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1007/s44212-022-00012-2}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{12}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Sensing urban soundscapes from street view imagery</title><link>https://ual.sg/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/</link><pubDate>Mon, 21 Nov 2022 13:36:16 +0800</pubDate><guid>https://ual.sg/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Zhao T, Liang X, Tu W, Huang Z, Biljecki F (2023): Sensing urban soundscapes from street view imagery. &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt; 99: 101915. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2022.101915" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2022.101915&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-ceus-soundscapes/2023-ceus-soundscapes.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/tianhong-zhao/"&gt;Tianhong Zhao&lt;/a&gt;.
Congratulations on his first journal paper in our Lab, great job! &amp;#x1f64c; &amp;#x1f44f;
Congratulations also to &lt;a href="https://ual.sg/author/xiucheng-liang/"&gt;Xiucheng Liang&lt;/a&gt;, who played an important role in the research.&lt;/p&gt;
&lt;p&gt;The work resulted in an open dataset: &lt;a href="https://github.com/ualsg/Visual-soundscapes" target="_blank" rel="noopener"&gt;Visual soundscapes&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Until 2023-01-09, the article is available for free via &lt;a href="https://authors.elsevier.com/a/1g6y8jFQguwgw" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/1_hu_a357dae3d2bf1cf7.webp 400w,
/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/1_hu_23fee0ab1814ed9a.webp 760w,
/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/1_hu_f4af36e493cd8ef9.webp 1200w"
src="https://ual.sg/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/1_hu_a357dae3d2bf1cf7.webp"
width="760"
height="671"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;A large-scale, high-resolution machine learning approach to sensing urban soundscapes without ground measurements.&lt;/li&gt;
&lt;li&gt;A dataset of street view imagery tagged with multivariate soundscape indicators.&lt;/li&gt;
&lt;li&gt;Quantifying the relationships between visual features and human soundscape perception.&lt;/li&gt;
&lt;li&gt;Validation with sound intensity and corresponding street view imagery using noise meters and cameras.&lt;/li&gt;
&lt;li&gt;Comparative analysis including two cities and a scalable approach.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/2_hu_791133e2588c0a7.webp 400w,
/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/2_hu_f04a71cf0602bd64.webp 760w,
/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/2_hu_d36e8709311e954c.webp 1200w"
src="https://ual.sg/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/2_hu_791133e2588c0a7.webp"
width="564"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;A healthy acoustic environment is an essential component of sustainable cities. Various noise monitoring and simulation techniques have been developed to measure and evaluate urban sounds. However, sensing large areas at a fine resolution remains a great challenge. Based on machine learning, we introduce a new application of street view imagery — estimating large-area high-resolution urban soundscapes, investigating the premise that we can predict and characterize soundscapes without laborious and expensive noise measurements. First, visual features are extracted from street-level imagery using computer vision. Second, fifteen soundscape indicators are identified and a survey is conducted to gauge them solely from images. Finally, a prediction model is constructed to infer the urban soundscape by modeling the non-linear relationship between them. The results are verified with extensive field surveys. Experiments conducted in Singapore and Shenzhen using half a million images affirm that street view imagery enables us to sense large-scale urban soundscapes with low cost but high accuracy and detail, and provides an alternative means to generate soundscape maps. R squared reaches 0.48 by evaluating the predicted results with field data collection. Further novelties in this domain are revealing the contributing visual elements and spatial laws of soundscapes, underscoring the usability of crowdsourced data, and exposing international patterns in perception.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-ceus-soundscapes/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-ceus-soundscapes/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/page-one_hu_8e4b4ce11911bb7d.webp 400w,
/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/page-one_hu_7e9cf71c6e2e7e69.webp 760w,
/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/page-one_hu_bdc4eb30f1d3c22e.webp 1200w"
src="https://ual.sg/post/2022/11/21/new-paper-sensing-urban-soundscapes-from-street-view-imagery/page-one_hu_8e4b4ce11911bb7d.webp"
width="575"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_ceus_soundscapes&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Zhao, Tianhong and Liang, Xiucheng and Tu, Wei and Huang, Zhengdong and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2022.101915}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{101915}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Sensing urban soundscapes from street view imagery}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{99}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Incorporating Networks in Semantic Understanding of Streetscapes</title><link>https://ual.sg/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/</link><pubDate>Sat, 19 Nov 2022 16:22:16 +0800</pubDate><guid>https://ual.sg/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Yap W, Chang JH, Biljecki F (2023): Incorporating Networks in Semantic Understanding of Streetscapes: Contextualising Active Mobility Decisions. &lt;em&gt;Environment and Planning B: Urban Analytics and City Science&lt;/em&gt; 50(6): 1416-1437. &lt;a href="https://doi.org/10.1177/23998083221138832" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1177/23998083221138832&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-epb-semantic-networks/2023-epb-semantic-networks.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt;.
Congratulations on his continued successes during his PhD! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/1_hu_eafe62f1e019bd55.webp 400w,
/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/1_hu_7b5a2769c9f77d9d.webp 760w,
/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/1_hu_7143a0d15c975b83.webp 1200w"
src="https://ual.sg/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/1_hu_eafe62f1e019bd55.webp"
width="760"
height="466"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/2_hu_726e1818baab7d36.webp 400w,
/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/2_hu_8ab3e16a5216c6c4.webp 760w,
/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/2_hu_41324290fde182d6.webp 1200w"
src="https://ual.sg/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/2_hu_726e1818baab7d36.webp"
width="760"
height="382"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Planning for active mobility satisfies many fundamental tenets of good urban design and planning. However, planning for active mobility is a complex endeavour due to numerous local, place-based factors that influence active mobility decisions. Recent advancements in urban data research have demonstrated the effectiveness of deep learning methods in evaluating active mobility potential for urban environments. At present, the incorporation of semantic information from deep learning models and street view imagery into spatio-temporal contexts remains a challenge. In particular, knowledge extraction from deep learning models remains an open question for urban planning and decision-making. Towards this issue, we propose a functional deep learning and network science workflow that employs open data from OpenStreetMap and Mapillary to assess factors affecting active mobility decisions and route planning. We demonstrate the generalisability of our analytical workflow through two case studies focusing on urban greenery in Nerima city (Japan) and urban visual complexity in Pasir Ris town (Singapore). Our results reveal clear patterns of heterogeneity in urban streetscapes and identify unevenness in street infrastructure provision based on destination types. Using this information, we propose specific areas for design intervention to improve active mobility planning. Our workflow is applicable for a diverse range of use cases making it relevant to a wide range of stakeholders, not limited to, urban researchers, policy makers and urban planners.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-epb-semantic-networks/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-epb-semantic-networks/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/page-one_hu_a7d4510ca11e5e98.webp 400w,
/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/page-one_hu_b45e7ce14ab11506.webp 760w,
/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/page-one_hu_d186392e16678435.webp 1200w"
src="https://ual.sg/post/2022/11/19/new-paper-incorporating-networks-in-semantic-understanding-of-streetscapes/page-one_hu_a7d4510ca11e5e98.webp"
width="510"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_epb_semantic_networks&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Yap, Winston and Chang, Jiat-Hwee and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1177/23998083221138832}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Environment and Planning B: Urban Analytics and City Science}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Incorporating Networks in Semantic Understanding of Streetscapes: Contextualising Active Mobility Decisions}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1416-1437}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{50}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;issue&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{6}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: A comprehensive framework for evaluating the quality of street view imagery</title><link>https://ual.sg/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/</link><pubDate>Mon, 14 Nov 2022 18:11:16 +0800</pubDate><guid>https://ual.sg/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Hou Y, Biljecki F (2022): A comprehensive framework for evaluating the quality of street view imagery. &lt;em&gt;International Journal of Applied Earth Observation and Geoinformation&lt;/em&gt; 115: 103094. &lt;a href="https://doi.org/10.1016/j.jag.2022.103094" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.jag.2022.103094&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-jag-svi-quality/2022-jag-svi-quality.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/yujun-hou/"&gt;Yujun Hou&lt;/a&gt;.
Congratulations on her first first-author paper! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The implementation was released openly: &lt;a href="https://github.com/ualsg/SVI-Quality-Checker" target="_blank" rel="noopener"&gt;Street View Imagery Quality Checker&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The paper is also the first to provide a definition of SVI:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Street view imagery (SVI) is typically a sequence of geotagged, ground-level photographs taken along a trajectory, providing spatially continuous observation of its vicinity.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/1_hu_1e2a25b2c536542a.webp 400w,
/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/1_hu_dfda54728e7a7004.webp 760w,
/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/1_hu_1fa22458e4f9bca1.webp 1200w"
src="https://ual.sg/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/1_hu_1e2a25b2c536542a.webp"
width="760"
height="726"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;We propose the first comprehensive quality framework for street view imagery.&lt;/li&gt;
&lt;li&gt;The framework comprises 48 quality elements and may be applied to other image datasets.&lt;/li&gt;
&lt;li&gt;We implement partial evaluation for data in 9 cities, exposing varying quality.&lt;/li&gt;
&lt;li&gt;The implementation is released open-source and can be applied to other locations.&lt;/li&gt;
&lt;li&gt;We provide an overdue definition of street view imagery.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/2_hu_6dbd92234e42e178.webp 400w,
/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/2_hu_e83ef86cd6b1ab0.webp 760w,
/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/2_hu_ff6baa156c0f21c3.webp 1200w"
src="https://ual.sg/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/2_hu_6dbd92234e42e178.webp"
width="760"
height="381"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Street view imagery (SVI) is increasingly in competition with traditional remote sensing sources and assuming its domination in myriads of studies, mainly thanks to the omnipresence of commercial services such as Google Street View. Similar to other spatial data, SVI may be of variable quality and burdened with a variety of errors. Recently, this concern has been amplified with the rise of volunteered SVI such as Mapillary and KartaView, which – akin to other instances of Volunteered Geographic Information (VGI) – are of heterogeneous quality. However, unlike with many other forms of spatial data, there has not been much discussion about the quality of SVI datasets, let alone a standard and mechanism to assess them. Further, current spatial data quality standards are not entirely applicable to SVI due to its particularities. Following a multi-pronged method, we establish a comprehensive framework for describing and assessing the quality of SVI. We present a categorised set of 48 elements that suggest the quality of imagery and associated data such as geographic information and metadata. The framework is applicable to any source of SVI, including both commercial and crowdsourcing services. In the implementation, which we release open-source, we assess several quality elements of SVI datasets across 9 cities. The results expose varying quality of SVI and affirm the importance of the work. Given the exponential volume of studies taking advantage of SVI, but largely overlooking quality aspects, this work is a timely contribution that will benefit data providers, contributors, and users. It may also be applied on other forms of image-based VGI, and underpin establishing a formal international standard in the future. On a broader perspective, while providing an overdue definition of SVI, this work also reveals issues and open questions that impede delineating and assessing this diverse form of urban and terrestrial imagery.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-jag-svi-quality/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-jag-svi-quality/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/page-one_hu_53815b9b44ef85a0.webp 400w,
/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/page-one_hu_90e018bbb70e3483.webp 760w,
/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/page-one_hu_82a03500b48600e2.webp 1200w"
src="https://ual.sg/post/2022/11/14/new-paper-a-comprehensive-framework-for-evaluating-the-quality-of-street-view-imagery/page-one_hu_53815b9b44ef85a0.webp"
width="567"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_jag_svi_quality&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{A comprehensive framework for evaluating the quality of street view imagery}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Hou, Yujun and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Applied Earth Observation and Geoinformation}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.jag.2022.103094}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{103094}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{115}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Assessing and benchmarking 3D city models</title><link>https://ual.sg/post/2022/11/10/new-paper-assessing-and-benchmarking-3d-city-models/</link><pubDate>Thu, 10 Nov 2022 16:24:16 +0800</pubDate><guid>https://ual.sg/post/2022/11/10/new-paper-assessing-and-benchmarking-3d-city-models/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Lei B, Stouffs R, Biljecki F (2023): Assessing and benchmarking 3D city models. &lt;em&gt;International Journal of Geographical Information Science&lt;/em&gt; 37(4): 788-809. &lt;a href="https://doi.org/10.1080/13658816.2022.2140808" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1080/13658816.2022.2140808&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-ijgis-3-d-city-index/2023-ijgis-3-d-city-index.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt;.
Congratulations on her first journal paper! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The project has a website: &lt;a href="https://ual.sg/project/3d-city-index/"&gt;3D City Index&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/10/new-paper-assessing-and-benchmarking-3d-city-models/1_hu_d72c3ee55a33d2e9.webp 400w,
/post/2022/11/10/new-paper-assessing-and-benchmarking-3d-city-models/1_hu_cf7a4505a6502431.webp 760w,
/post/2022/11/10/new-paper-assessing-and-benchmarking-3d-city-models/1_hu_ce7efcab4c5fa7af.webp 1200w"
src="https://ual.sg/post/2022/11/10/new-paper-assessing-and-benchmarking-3d-city-models/1_hu_d72c3ee55a33d2e9.webp"
width="736"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;3D city models are omnipresent in urban management and simulations. However, instruments for their evaluation have been limited. Furthermore, current instances are scattered worldwide and developed independently, hampering their comparison and understanding practices. While there are developed assessment frameworks in open data, such efforts are generic and not applied to geospatial data. We establish a holistic and comprehensive four-category framework ‘3D City Index’, encompassing 47 criteria to identify key properties of 3D city models, enabling their assessment and benchmarking, and suggesting usability. We evaluate 40 authoritative 3D city models and derive quantitative and qualitative insights. The framework implementation enables a comprehensive and structured understanding of the landscape of semantic 3D geospatial data, as well as doubles as an evaluated collection of open 3D city models. For example, datasets differ substantially in their characteristics, having heterogeneous properties influenced by their different purposes. There are further applications of this first endeavour to standardise the characterisation of 3D data: monitoring developments and trends in 3D city modelling, and enabling researchers and practitioners to find the most appropriate datasets for their needs. The work is designed to measure datasets continuously and can also be applied to other instances in spatial data infrastructures.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-ijgis-3-d-city-index/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-ijgis-3-d-city-index/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/10/new-paper-assessing-and-benchmarking-3d-city-models/page-one_hu_b7c3a835d68c7667.webp 400w,
/post/2022/11/10/new-paper-assessing-and-benchmarking-3d-city-models/page-one_hu_351371aab6cba5f4.webp 760w,
/post/2022/11/10/new-paper-assessing-and-benchmarking-3d-city-models/page-one_hu_6febcafead6af50d.webp 1200w"
src="https://ual.sg/post/2022/11/10/new-paper-assessing-and-benchmarking-3d-city-models/page-one_hu_b7c3a835d68c7667.webp"
width="495"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_ijgis_3d_city_index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Lei, Binyu and Stouffs, Rudi and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1080/13658816.2022.2140808}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Geographical Information Science}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Assessing and benchmarking 3D city models}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{788-809}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{37}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;issue&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{4}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Water View Imagery</title><link>https://ual.sg/post/2022/11/05/new-paper-water-view-imagery/</link><pubDate>Sat, 05 Nov 2022 18:11:16 +0800</pubDate><guid>https://ual.sg/post/2022/11/05/new-paper-water-view-imagery/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Luo J, Zhao T, Cao L, Biljecki F (2022): Water View Imagery: Perception and evaluation of urban waterscapes worldwide. &lt;em&gt;Ecological Indicators&lt;/em&gt; 145: 109615. &lt;a href="https://doi.org/10.1016/j.ecolind.2022.109615" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.ecolind.2022.109615&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-ei-water-view-imagery/2022-ei-water-view-imagery.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/junjie-luo/"&gt;Junjie Luo&lt;/a&gt;.
Congratulations on another journal paper during his one-year research visit to our Lab, great job! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The work resulted in an open dataset: &lt;a href="https://github.com/ualsg/Water-View-Imagery-dataset" target="_blank" rel="noopener"&gt;Water View Imagery&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/05/new-paper-water-view-imagery/1_hu_6d5f8fecc149ea7c.webp 400w,
/post/2022/11/05/new-paper-water-view-imagery/1_hu_dd3d67affcc5123.webp 760w,
/post/2022/11/05/new-paper-water-view-imagery/1_hu_64c10caa349c4866.webp 1200w"
src="https://ual.sg/post/2022/11/05/new-paper-water-view-imagery/1_hu_6d5f8fecc149ea7c.webp"
width="760"
height="390"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Investigating the concept and usability of water-level imagery.&lt;/li&gt;
&lt;li&gt;Comparative analysis of waterscapes in 16 cities from water-level imagery.&lt;/li&gt;
&lt;li&gt;A comprehensive perception study of multiple dimensions of waterfronts.&lt;/li&gt;
&lt;li&gt;An extensible and scalable large-scale evaluation index system of urban waterscapes.&lt;/li&gt;
&lt;li&gt;Open dataset supporting future studies, suited for computer vision.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/05/new-paper-water-view-imagery/2_hu_84c8305bc06b0404.webp 400w,
/post/2022/11/05/new-paper-water-view-imagery/2_hu_e1fe4ca8fcbe0400.webp 760w,
/post/2022/11/05/new-paper-water-view-imagery/2_hu_f641aa5bd800c9db.webp 1200w"
src="https://ual.sg/post/2022/11/05/new-paper-water-view-imagery/2_hu_84c8305bc06b0404.webp"
width="760"
height="443"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Gathering knowledge about the physical settings and visual information of places has long been of interest to a wide variety of fields, as these affect the experience of observers. Previous studies have relied on on-site surveys, low-throughput methods, and limited data sources, which especially hinder the analysis of waterscape features. Thus, relating human perceptions of large-scale urban water areas to waterfront features at high spatial resolution remains challenging, and no worldwide studies have been conducted. We investigate an alternative: a data-driven waterscape evaluation approach based on computer vision (CV) to analyze water view imagery (WVI) in 16 cities around the world and measure how people perceive scenes using virtual reality (VR). We bring attention to WVI – the counterpart of street view imagery (SVI) on water bodies, which is readily available for many cities thanks to the usual SVI services, but has hitherto been entirely overlooked in research. Specifically, a deep learning model, trained on 500 segmented water-level photos, was developed to analyze the imagery, achieving a mean pixel accuracy (MPA) of 94% and advancing the state of the art. These panoramic images were assessed through a virtual experience survey in which 60 participants indicated their perceptions across multiple dimensions. Afterwards, a series of statistical analyses was conducted to determine the visual indicators that drive perceptions, and the relationship between people’s subjective visual perceptions and the objective waterscape environment as seen by machines was established. The results take researchers and watercourse planners one step closer to understanding the interactions between the perceptions and semantics of water areas globally.
The large-scale dataset we produced in this research has been released openly as the first such instance of open segmented water view imagery, and it is intended to support future studies.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/05/new-paper-water-view-imagery/3_hu_894c381210cc922c.webp 400w,
/post/2022/11/05/new-paper-water-view-imagery/3_hu_940fd6df4abbfb27.webp 760w,
/post/2022/11/05/new-paper-water-view-imagery/3_hu_7c2d3c8d5f03205b.webp 1200w"
src="https://ual.sg/post/2022/11/05/new-paper-water-view-imagery/3_hu_894c381210cc922c.webp"
width="760"
height="586"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-ei-water-view-imagery/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-ei-water-view-imagery/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/05/new-paper-water-view-imagery/page-one_hu_6cdd012fe039c77e.webp 400w,
/post/2022/11/05/new-paper-water-view-imagery/page-one_hu_60ab2d0a3940b1e6.webp 760w,
/post/2022/11/05/new-paper-water-view-imagery/page-one_hu_6d0320ad79030404.webp 1200w"
src="https://ual.sg/post/2022/11/05/new-paper-water-view-imagery/page-one_hu_6cdd012fe039c77e.webp"
width="616"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_ei_water_view_imagery&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Luo, Junjie and Zhao, Tianhong and Cao, Lei and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.ecolind.2022.109615}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Ecological Indicators}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{109615}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Water View Imagery: Perception and evaluation of urban waterscapes worldwide}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{145}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Release of a video feature about us!</title><link>https://ual.sg/post/2022/11/01/release-of-a-video-feature-about-us/</link><pubDate>Tue, 01 Nov 2022 11:17:16 +0800</pubDate><guid>https://ual.sg/post/2022/11/01/release-of-a-video-feature-about-us/</guid><description>&lt;p&gt;We are excited to share a video about our research group produced by our &lt;a href="https://cde.nus.edu.sg" target="_blank" rel="noopener"&gt;NUS College of Design and Engineering&lt;/a&gt;!&lt;/p&gt;
&lt;div
style="position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;"&gt;
&lt;iframe
src="https://player.vimeo.com/video/764033095?dnt=0"
style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;" allow="fullscreen"&gt;
&lt;/iframe&gt;
&lt;/div&gt;
&lt;hr&gt;
&lt;p&gt;Besides the &lt;a href="https://ual.sg/authors/filip/"&gt;PI&lt;/a&gt; of the Lab, the video features PhD researchers &lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt;, &lt;a href="https://ual.sg/author/koichi-ito/"&gt;Koichi Ito&lt;/a&gt;, &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt;, and &lt;a href="https://ual.sg/author/abraham-noah-wu/"&gt;Abraham Noah Wu&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The video &lt;em&gt;Building smarter, more sustainable, liveable cities: The Urban Analytics Lab&lt;/em&gt; is also available via &lt;a href="https://vimeo.com/764033095" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Many thanks to our College for supporting and amplifying the stuff we do.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;Do you want to join us?
The deadline for the next intake for the &lt;a href="https://ual.sg/opportunities/phd"&gt;PhD programme&lt;/a&gt; at our department is approaching.&lt;/p&gt;</description></item><item><title>New paper: Food production potential of high-rise public housing apartment buildings</title><link>https://ual.sg/post/2022/11/01/new-paper-food-production-potential-of-high-rise-public-housing-apartment-buildings/</link><pubDate>Tue, 01 Nov 2022 10:51:16 +0800</pubDate><guid>https://ual.sg/post/2022/11/01/new-paper-food-production-potential-of-high-rise-public-housing-apartment-buildings/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper in which we were involved:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Song S, Cheong JC, Lee JSH, Tan JKN, Chiam Z, Arora S, Png KJQ, Seow JWC, Leong FWS, Palliwal A, Biljecki F, Tablada A, Tan HTW (2022): Home gardening in Singapore: A feasibility study on the utilization of the vertical space of retrofitted high-rise public housing apartment buildings to increase urban vegetable self-sufficiency. &lt;em&gt;Urban Forestry &amp;amp; Urban Greening&lt;/em&gt;, 78: 127755. &lt;a href="https://doi.org/10.1016/j.ufug.2022.127755" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.ufug.2022.127755&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-ufug-homegardening/2022-ufug-homegardening.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Congratulations to &lt;a href="https://www.researchgate.net/profile/Shuang-Song-33" target="_blank" rel="noopener"&gt;Dr Song Shuang&lt;/a&gt; from &lt;a href="https://www.dbs.nus.edu.sg" target="_blank" rel="noopener"&gt;NUS Biological Sciences&lt;/a&gt; for the publication of this forward-looking work, and thanks for this productive and exciting collaboration!&lt;/p&gt;
&lt;p&gt;This experimental work, conducted in a high-rise residential building in Singapore, demonstrates that with proper optimisations, about half of the vegetable needs of the residents living in a building of such a typology can be met by food production in the building alone.
In this multidisciplinary collaboration, 3D building models have been used to facilitate the process (based on the work by &lt;a href="https://ual.sg/author/ankit-palliwal/"&gt;Ankit Palliwal&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-ceus-3-d-farming/"&gt;published in CEUS in 2021&lt;/a&gt;, introducing a new use case of 3D city models in urban farming).&lt;/p&gt;
&lt;p&gt;The work supports &lt;a href="https://www.ourfoodfuture.gov.sg/30by30" target="_blank" rel="noopener"&gt;Singapore&amp;rsquo;s &amp;lsquo;30 by 30&amp;rsquo; initiative &amp;ndash; aspiring to develop the agri-food industry&amp;rsquo;s capability and capacity to produce 30% of the city-state&amp;rsquo;s nutritional needs locally and sustainably by 2030&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;This line of research also sets the scene for investigating the use of digital twins in supporting urban farming, which may be especially relevant in land-scarce urban areas such as Singapore (and many others), and against the backdrop of supply chain disruptions that may impede food security.&lt;/p&gt;
&lt;p&gt;Until 2022-12-09, the article is available for free via &lt;a href="https://authors.elsevier.com/c/1fy7f5m5d7vrWj" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/01/new-paper-food-production-potential-of-high-rise-public-housing-apartment-buildings/1_hu_172e461d7b8c00c5.webp 400w,
/post/2022/11/01/new-paper-food-production-potential-of-high-rise-public-housing-apartment-buildings/1_hu_a7de804008276bbe.webp 760w,
/post/2022/11/01/new-paper-food-production-potential-of-high-rise-public-housing-apartment-buildings/1_hu_32579df6a561eb46.webp 1200w"
src="https://ual.sg/post/2022/11/01/new-paper-food-production-potential-of-high-rise-public-housing-apartment-buildings/1_hu_172e461d7b8c00c5.webp"
width="760"
height="452"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;In land-scarce cities, high-rise apartment buildings may provide vertical spaces for natural-light home gardening along corridors, rooftops, balconies as well as façades. The vertical space can improve not only urban environmental sustainability but also food security. Using an experimental approach, we investigated the food production potential of a high-rise public housing apartment building based on different gardening systems, food crops, and sunlight availability. A gardening prototype system for building corridors was shown to increase the unit area yield of corridor gardening by fivefold compared to a commercial trough planter system. Additionally, this commercial trough planter system was mainly for leafy vegetable production, whereas the gardening prototype system for corridors is also suitable for climbing crops, such as legumes and cucurbits. Nevertheless, because of the limited space along corridors of the apartment building and the relatively low-light levels on average, corridor gardening was estimated to meet only 0.5 % of the demand for vegetables of the residents living in the apartment building. Rooftop gardening with shallow growing medium (depth &amp;lt; 15 cm) was estimated to meet 3 % of demand, and façade gardening 43 %, given the larger space available. Although the vegetable production potential in this study was estimated based on a particular typology of public housing apartment buildings in Singapore, our results showed that vegetable production in public housing apartment buildings is feasible, and home gardening can produce a substantial amount of vegetables for consumption if well deployed. Governments of highly urbanized cities may wish to invest in better home garden designs for high-rise public housing apartment buildings and encourage residents’ participation in home gardening, which would increase high-rise greenery coverage and improve urban food system resilience. 
Future studies should also investigate the environmental sustainability and food safety aspects of home gardening in highly urbanized cities.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-ufug-homegardening/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-ufug-homegardening/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/11/01/new-paper-food-production-potential-of-high-rise-public-housing-apartment-buildings/page-one_hu_a7cb12e6d63a40dc.webp 400w,
/post/2022/11/01/new-paper-food-production-potential-of-high-rise-public-housing-apartment-buildings/page-one_hu_5bdb7a0029e7207d.webp 760w,
/post/2022/11/01/new-paper-food-production-potential-of-high-rise-public-housing-apartment-buildings/page-one_hu_ec8f308a4d91015.webp 1200w"
src="https://ual.sg/post/2022/11/01/new-paper-food-production-potential-of-high-rise-public-housing-apartment-buildings/page-one_hu_a7cb12e6d63a40dc.webp"
width="592"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_ufug_homegardening&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Home gardening in Singapore: A feasibility study on the utilization of the vertical space of retrofitted high-rise public housing apartment buildings to increase urban vegetable self-sufficiency}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Shuang Song and Jia Chin Cheong and Joel S.H. Lee and Jonathan K.N. Tan and Zhongyu Chiam and Srishti Arora and Karl J.Q. Png and Johanah W.C. Seow and Felicia W.S. Leong and Ankit Palliwal and Filip Biljecki and Abel Tablada and Hugh T.W. Tan}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Urban Forestry \&amp;amp; Urban Greening}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{78}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{127755}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.ufug.2022.127755}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>All of our recent visiting scholars have been awarded prestigious scholarships</title><link>https://ual.sg/post/2022/10/25/all-of-our-recent-visiting-scholars-have-been-awarded-prestigious-scholarships/</link><pubDate>Tue, 25 Oct 2022 09:39:49 +0800</pubDate><guid>https://ual.sg/post/2022/10/25/all-of-our-recent-visiting-scholars-have-been-awarded-prestigious-scholarships/</guid><description>&lt;p&gt;All three of our recent visiting scholars, who each spent a year in our Lab, have won this year&amp;rsquo;s China National Scholarship for PhD students awarded by the government, which is the highest scholarship in the country and among the most competitive (awarded to the top few percent of students). &amp;#x1f3c6;&lt;/p&gt;
&lt;p&gt;Big congratulations to &lt;a href="https://ual.sg/author/junjie-luo/"&gt;Junjie Luo&lt;/a&gt; (Tianjin University), &lt;a href="https://ual.sg/author/tianhong-zhao/"&gt;Tianhong Zhao&lt;/a&gt; (Shenzhen University), and &lt;a href="https://ual.sg/author/yan-zhang/"&gt;Yan Zhang&lt;/a&gt; (Wuhan University) for this top achievement. &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;During their stay, they also published &lt;a href="https://ual.sg/publication/"&gt;papers&lt;/a&gt; in leading journals.&lt;/p&gt;
&lt;p&gt;We are honoured to have hosted you in our research group and wish you continued success in your future endeavours.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/10/25/all-of-our-recent-visiting-scholars-have-been-awarded-prestigious-scholarships/certificate_hu_765f7272cac2a29a.webp 400w,
/post/2022/10/25/all-of-our-recent-visiting-scholars-have-been-awarded-prestigious-scholarships/certificate_hu_844b174bb1ca3c5b.webp 760w,
/post/2022/10/25/all-of-our-recent-visiting-scholars-have-been-awarded-prestigious-scholarships/certificate_hu_8866f32f35dacab3.webp 1200w"
src="https://ual.sg/post/2022/10/25/all-of-our-recent-visiting-scholars-have-been-awarded-prestigious-scholarships/certificate_hu_765f7272cac2a29a.webp"
width="760"
height="197"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>UAL at the SDSC &amp; 3D GeoInfo Joint Conference</title><link>https://ual.sg/post/2022/10/18/ual-at-the-sdsc-3d-geoinfo-joint-conference/</link><pubDate>Tue, 18 Oct 2022 12:11:16 +0800</pubDate><guid>https://ual.sg/post/2022/10/18/ual-at-the-sdsc-3d-geoinfo-joint-conference/</guid><description>&lt;p&gt;The Lab is participating in the &lt;a href="https://www.sdsc3dgeoinfo.unsw.edu.au" target="_blank" rel="noopener"&gt;7th Smart Data Smart Cities &amp;amp; 17th 3D GeoInfo
Joint International Conference&lt;/a&gt; held in Sydney, Australia.
It is organised by The University of New South Wales (UNSW).&lt;/p&gt;
&lt;p&gt;Several members of our research group &amp;ndash; &lt;a href="https://ual.sg/author/leon-gaw/"&gt;Leon Gaw&lt;/a&gt;, &lt;a href="https://ual.sg/author/marcel-ignatius/"&gt;Marcel Ignatius&lt;/a&gt;, &lt;a href="https://ual.sg/author/shuting-chen/"&gt;Shuting Chen&lt;/a&gt;, &lt;a href="https://ual.sg/author/yoong-shin-chow/"&gt;Yoong Shin Chow&lt;/a&gt;, &lt;a href="https://ual.sg/author/kay-lee/"&gt;Kay Lee&lt;/a&gt;, &lt;a href="https://ual.sg/author/yujun-hou/"&gt;Yujun Hou&lt;/a&gt;, &lt;a href="https://ual.sg/author/xiucheng-liang/"&gt;Xiucheng Liang&lt;/a&gt;, &lt;a href="https://ual.sg/author/tianhong-zhao/"&gt;Tianhong Zhao&lt;/a&gt;, and &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; &amp;mdash; have contributed to four (open access) papers published in the conference proceedings:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Gaw LY, Chen S, Chow YS, Lee K, Biljecki F (2022): Comparing street view imagery and aerial perspectives in the built environment. ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences, X-4/W3-2022: 49-56. &lt;a href="https://doi.org/10.5194/isprs-annals-X-4-W3-2022-49-2022" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-X-4-W3-2022-49-2022&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-sdsc-svi-sat-comparison/2022-sdsc-svi-sat-comparison.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Ignatius M, Xu R, Hou Y, Liang X, Zhao T, Chen S, Wong NH, Biljecki F (2022): Local Climate Zones: Lessons from Singapore and potential improvement with street view imagery. ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences, X-4/W2-2022: 121-128. &lt;a href="https://doi.org/10.5194/isprs-annals-X-4-W2-2022-121-2022" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-X-4-W2-2022-121-2022&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-3-dgeoinfo-lcz-sg-svi/2022-3-dgeoinfo-lcz-sg-svi.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Alva P, Biljecki F, Stouffs R (2022): Use cases for district-scale urban digital twins. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XLVIII-4/W4-2022: 5-12. &lt;a href="https://doi.org/10.5194/isprs-archives-XLVIII-4-W4-2022-5-2022" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-archives-XLVIII-4-W4-2022-5-2022&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-3-dgeoinfo-dt-use-cases/2022-3-dgeoinfo-dt-use-cases.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Pei WY, Biljecki F, Stouffs R (2022): Dataset for urban scale building stock modelling: Identification and review of potential data collection approaches. ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences, X-4/W2-2022: 225–232. &lt;a href="https://doi.org/10.5194/isprs-annals-X-4-W2-2022-225-2022" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-X-4-W2-2022-225-2022&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-3-dgeoinfo-urban-building-stock/2022-3-dgeoinfo-urban-building-stock.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;For some of them, this is their first paper, congrats! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;h3 id="proceedings"&gt;Proceedings&lt;/h3&gt;
&lt;p&gt;The conference papers are published in four proceedings volumes.
Each of the two conference series has its papers published in both the ISPRS Annals and the ISPRS Archives, as separate volumes.
Here are the links to all the papers:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;3D GeoInfo
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/X-4-W2-2022/" target="_blank" rel="noopener"&gt;ISPRS Annals&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLVIII-4-W4-2022/" target="_blank" rel="noopener"&gt;ISPRS Archives&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;SDSC
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/X-4-W3-2022/" target="_blank" rel="noopener"&gt;ISPRS Annals&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLVIII-4-W5-2022/" target="_blank" rel="noopener"&gt;ISPRS Archives&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="citation-information"&gt;Citation information&lt;/h3&gt;
&lt;p&gt;BibTeX citations for the four papers:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_sdsc_svi_sat_comparison&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Comparing street view imagery and aerial perspectives in the built environment}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Gaw, LY and Chen, S and Chow, YS and Lee, K and Biljecki, F}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{X-4/W3-2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{49-56}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-annals-X-4-W3-2022-49-2022}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_3dgeoinfo_lcz_sg_svi&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Local Climate Zones: Lessons from Singapore and potential improvement with street view imagery}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Ignatius, M and Xu, R and Hou, Y and Liang, X and Zhao, T and Chen, S and Wong, NH and Biljecki, F}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{X-4/W2-2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{121-128}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-annals-X-4-W2-2022-121-2022}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_3dgeoinfo_dt_use_cases&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Use cases for district-scale urban digital twins}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Alva, P and Biljecki, F and Stouffs, R}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{XLVIII-4/W4-2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{5-12}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-archives-XLVIII-4-W4-2022-5-2022}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_3dgeoinfo_urban_building_stock&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Dataset for urban scale building stock modelling: Identification and review of potential data collection approaches}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Pei, WY and Biljecki, F and Stouffs, R}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{X-4/W2-2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{225–232}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-annals-X-4-W2-2022-225-2022}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Semantic Riverscapes</title><link>https://ual.sg/post/2022/09/23/new-paper-semantic-riverscapes/</link><pubDate>Fri, 23 Sep 2022 22:11:16 +0800</pubDate><guid>https://ual.sg/post/2022/09/23/new-paper-semantic-riverscapes/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Luo J, Zhao T, Cao L, Biljecki F (2022): Semantic Riverscapes: Perception and evaluation of linear landscapes from oblique imagery using computer vision. &lt;em&gt;Landscape and Urban Planning&lt;/em&gt; 228: 104569. &lt;a href="https://doi.org/10.1016/j.landurbplan.2022.104569" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.landurbplan.2022.104569&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-land-semantic-riverscapes/2022-land-semantic-riverscapes.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/junjie-luo/"&gt;Junjie Luo&lt;/a&gt;.
Congratulations on his first journal paper in our Lab, great job! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The work resulted in an open dataset: &lt;a href="https://github.com/ualsg/semantic-riverscapes-dataset" target="_blank" rel="noopener"&gt;Semantic Riverscapes&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Until 2022-11-12, the article is available for free via &lt;a href="https://authors.elsevier.com/a/1foYAcUG5OuW7" target="_blank" rel="noopener"&gt;this link&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/09/23/new-paper-semantic-riverscapes/1_hu_2b2eabe121cba82b.webp 400w,
/post/2022/09/23/new-paper-semantic-riverscapes/1_hu_36245575045b11e5.webp 760w,
/post/2022/09/23/new-paper-semantic-riverscapes/1_hu_727919056a407404.webp 1200w"
src="https://ual.sg/post/2022/09/23/new-paper-semantic-riverscapes/1_hu_2b2eabe121cba82b.webp"
width="760"
height="458"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Visual perception and evaluation of landscapes are important in large-scale river analysis.&lt;/li&gt;
&lt;li&gt;A new approach using UAV oblique imagery and computer vision.&lt;/li&gt;
&lt;li&gt;A comprehensive perception study of riverscapes with bifurcated experiments.&lt;/li&gt;
&lt;li&gt;The method is automated and scalable in other geographies.&lt;/li&gt;
&lt;li&gt;The open dataset supports future studies.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/09/23/new-paper-semantic-riverscapes/2_hu_d88a15f3ecbfa1e7.webp 400w,
/post/2022/09/23/new-paper-semantic-riverscapes/2_hu_a8808ea5610e2678.webp 760w,
/post/2022/09/23/new-paper-semantic-riverscapes/2_hu_3ba1f0fb5774e4fa.webp 1200w"
src="https://ual.sg/post/2022/09/23/new-paper-semantic-riverscapes/2_hu_d88a15f3ecbfa1e7.webp"
width="760"
height="379"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Traditional approaches for visual perception and evaluation of river landscapes adopt on-site surveys or assessments through photographs. The former is expensive, hindering large-scale analyses, and it is conducted only on street-level or top-down imagery. The latter only reflects the subjective perception and also entails a laborious process. Addressing these challenges, this study proposes an alternative: a novel workflow for visual analysis of urban river landscapes by combining unmanned aerial vehicle (UAV) oblique photography with computer vision (CV) and virtual reality (VR). The approach is demonstrated with an experiment on a section of the Grand Canal in China where UAV oblique panoramic imagery has been processed using semantic segmentation for visual evaluation with an index system we designed. Concurrent surveys, immersive and non-immersive VR, are used to evaluate these photos, with a total of 111 participants expressing their perceptions across multiple dimensions. Then, the relationship between the people’s subjective visual perception and the river landscape environment as seen by computers has been established. The results suggest that using this approach, rivers and surrounding landscapes can be analyzed automatically and efficiently, and the mean pixel accuracy (MPA) of the developed model is 90%, which advances state of the art. The results of this study can benefit urban planners in formulating riverside development policies, analyzing the perception of plans for a future scenario before an area is redeveloped, and the method can also aid relevant parties in having a macro understanding of the overall situation of the river as a basis for follow-up research. Due to simplicity, accuracy and effectiveness, this workflow is transferable and cost-effective for large-scale investigations of riverscapes and linear heritage. 
We openly release Semantic Riverscapes—the dataset we collected and processed, bridging another gap in the field.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-land-semantic-riverscapes/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-land-semantic-riverscapes/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/09/23/new-paper-semantic-riverscapes/page-one_hu_754d11c36975d476.webp 400w,
/post/2022/09/23/new-paper-semantic-riverscapes/page-one_hu_9552e428a6dcab3e.webp 760w,
/post/2022/09/23/new-paper-semantic-riverscapes/page-one_hu_1edd6b1254812a89.webp 1200w"
src="https://ual.sg/post/2022/09/23/new-paper-semantic-riverscapes/page-one_hu_754d11c36975d476.webp"
width="588"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_land_semantic_riverscapes&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Semantic Riverscapes: Perception and evaluation of linear landscapes from oblique imagery using computer vision}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Luo, Junjie and Zhao, Tianhong, and Cao, Lei and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Landscape and Urban Planning}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.landurbplan.2022.104569}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104569}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{228}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Generative Adversarial Networks in the built environment: A comprehensive review of the application of GANs across data types and scales</title><link>https://ual.sg/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/</link><pubDate>Mon, 22 Aug 2022 15:11:16 +0800</pubDate><guid>https://ual.sg/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Wu AN, Stouffs R, Biljecki F (2022): Generative Adversarial Networks in the Built Environment: A Comprehensive Review of the Application of GANs across Data Types and Scales. &lt;em&gt;Building and Environment&lt;/em&gt; 223: 109477. &lt;a href="https://doi.org/10.1016/j.buildenv.2022.109477" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.buildenv.2022.109477&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-bae-gan/2022-bae-gan.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This review paper was led by &lt;a href="https://ual.sg/author/abraham-noah-wu/"&gt;Abraham Noah Wu&lt;/a&gt;.
Congratulations on his continued publication success as he advances in his academic career. &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/1_hu_68d9853a4844e549.webp 400w,
/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/1_hu_3f47325fe95f881.webp 760w,
/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/1_hu_c57b35f6c03488df.webp 1200w"
src="https://ual.sg/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/1_hu_68d9853a4844e549.webp"
width="640"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Generative Adversarial Networks (GANs) are a type of deep neural network that have achieved many state-of-the-art results for generative tasks. GANs can be useful in the built environment, from processing large-scale urban mobility data and remote sensing images at the regional level, to performance analysis and design generation at the building level. We analyzed 100 articles to provide a comprehensive state-of-the-art review on how GANs are currently applied to solve challenging tasks in the built environment. Our results show that: (i) GANs are replacing older methods in some problems and setting state-of-the-art performances; (ii) GANs are opening new frontiers in previously overlooked problems, such as automatically generating spatially accurate floorplan layouts; (iii) GANs can be applied to different scales in the built environment, from entire cities to neighborhoods and buildings; and (iv) GANs are being used in a variety of problems and data types, from remote sensing data augmentation, vector data generation, spatio-temporal data privacy protection, to building design generation. In total, there are 26 unique application domains enabled by GANs; (v) however, one common challenge in this field currently is the lack of high-quality datasets curated specifically for problems in the built environment. With more data in the future, GANs could potentially produce even better results than today.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/2_hu_d13e1632060b616d.webp 400w,
/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/2_hu_b80352dc66ff0b1a.webp 760w,
/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/2_hu_47f53711a92be436.webp 1200w"
src="https://ual.sg/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/2_hu_d13e1632060b616d.webp"
width="760"
height="460"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-bae-gan/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-bae-gan/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/page-one_hu_664343053540ac75.webp 400w,
/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/page-one_hu_325248c630ff18bc.webp 760w,
/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/page-one_hu_e4b259d83050b5a6.webp 1200w"
src="https://ual.sg/post/2022/08/22/new-paper-generative-adversarial-networks-in-the-built-environment-a-comprehensive-review-of-the-application-of-gans-across-data-types-and-scales/page-one_hu_664343053540ac75.webp"
width="562"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_bae_gan&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Wu, Abraham Noah and Stouffs, Rudi and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Building and Environment}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Generative Adversarial Networks in the Built Environment: A Comprehensive Review of the Application of GANs across Data Types and Scales}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{109477}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{223}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.buildenv.2022.109477}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Unsupervised machine learning in urban studies: A systematic review of applications</title><link>https://ual.sg/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/</link><pubDate>Sat, 20 Aug 2022 08:11:16 +0800</pubDate><guid>https://ual.sg/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Wang J, Biljecki F (2022): Unsupervised machine learning in urban studies: A systematic review of applications. &lt;em&gt;Cities&lt;/em&gt; 129: 103925. &lt;a href="https://doi.org/10.1016/j.cities.2022.103925" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.cities.2022.103925&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-cities-unsupervised/2022-cities-unsupervised.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This review paper was led by &lt;a href="https://ual.sg/author/jing-wang/"&gt;Jing Wang&lt;/a&gt;.
Congratulations on her first journal paper ever, great job! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/1_hu_88f77cb73457898f.webp 400w,
/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/1_hu_ce7da00bd6624f56.webp 760w,
/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/1_hu_f6441d4119d315a0.webp 1200w"
src="https://ual.sg/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/1_hu_88f77cb73457898f.webp"
width="760"
height="645"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;An in-depth systematic review on the applications of unsupervised learning in urban studies.&lt;/li&gt;
&lt;li&gt;140 papers reveal unsupervised learning penetrates a broad range of topics under four main themes.&lt;/li&gt;
&lt;li&gt;Introduction to the concept and common techniques.&lt;/li&gt;
&lt;li&gt;Statistical insights into evolution and prominent trends.&lt;/li&gt;
&lt;li&gt;Limitations and research opportunities of leveraging unsupervised learning in analyzing cities.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/2_hu_ddcf701dc399c0bc.webp 400w,
/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/2_hu_85ad135c4d340579.webp 760w,
/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/2_hu_9c31a44580dbdb96.webp 1200w"
src="https://ual.sg/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/2_hu_ddcf701dc399c0bc.webp"
width="760"
height="451"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Unsupervised learning (UL) has a long and successful history in untangling the complexity of cities. As the counterpart of supervised learning, it discovers patterns from intrinsic data structures without crafted labels, which is believed to be the key to real AI-generated decisions. This paper provides a systematic review of the use of UL in urban studies based on 140 publications. Firstly, the topic, technique, application, data type, and evaluation method of each paper are recorded, deriving statistical insights into the evolution and trends. Clustering is the most prominent method, followed by topic modeling. With the strong momentum of deep learning, a growing application field of UL methods is representing the complex real-world urban systems at multiple scales through multi-source data integration. Subsequently, a detailed review discusses how UL is applied in a broad range of urban topics, which are concluded by four dominant themes: urbanization and regional studies, built environment, urban sustainability, and urban dynamics. Finally, the review addresses common limitations regarding data quality, subjective interpretation, and validation difficulty of the results, which increasingly require interdisciplinary knowledge. Research opportunities are found in the rapidly evolving technological landscape of UL and in certain domains where supervised learning dominates.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-cities-unsupervised/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-cities-unsupervised/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/page-one_hu_22b5862d071000fb.webp 400w,
/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/page-one_hu_40a70caa4668d9af.webp 760w,
/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/page-one_hu_5213f6e7390a973e.webp 1200w"
src="https://ual.sg/post/2022/08/20/new-paper-unsupervised-machine-learning-in-urban-studies-a-systematic-review-of-applications/page-one_hu_22b5862d071000fb.webp"
width="596"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_cities_unsupervised&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Wang, Jing and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Cities}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Unsupervised machine learning in urban studies: A systematic review of applications}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{103925}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{129}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.cities.2022.103925}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>State of the Map 2022</title><link>https://ual.sg/post/2022/08/18/state-of-the-map-2022/</link><pubDate>Thu, 18 Aug 2022 12:18:49 +0800</pubDate><guid>https://ual.sg/post/2022/08/18/state-of-the-map-2022/</guid><description>&lt;p&gt;&lt;a href="https://stateofthemap.org" target="_blank" rel="noopener"&gt;State of the Map (SOTM)&lt;/a&gt; is the yearly summit of the &lt;a href="https://www.openstreetmap.org" target="_blank" rel="noopener"&gt;OpenStreetMap (OSM)&lt;/a&gt; community.
&lt;a href="https://2022.stateofthemap.org" target="_blank" rel="noopener"&gt;This year&amp;rsquo;s event&lt;/a&gt; is taking place in Florence, Italy and online.
It also includes an academic track.&lt;/p&gt;
&lt;p&gt;We have two key contributions to this year&amp;rsquo;s edition.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/pengyuan-liu/"&gt;Pengyuan Liu&lt;/a&gt; is part of the Scientific Committee for the &lt;a href="https://2022.stateofthemap.org/calls/academic/" target="_blank" rel="noopener"&gt;Academic Track&lt;/a&gt;.
This is the fifth edition of the track.
The papers of the academic track are published open access at &lt;a href="https://zenodo.org/communities/sotm-22/" target="_blank" rel="noopener"&gt;Zenodo&lt;/a&gt; (make sure you also check &lt;a href="https://doi.org/10.5281/zenodo.7004791" target="_blank" rel="noopener"&gt;the editorial&lt;/a&gt; co-authored by Pengyuan).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/shiyue-zhong/"&gt;Shiyue Zhong&lt;/a&gt; is giving &lt;a href="https://2022.stateofthemap.org/sessions/RHF3UX/" target="_blank" rel="noopener"&gt;a talk&lt;/a&gt; on our ongoing research (&lt;em&gt;Exploring Human Bias and Effects of Training in OSM mapping: A Behavioral Experiment in Singapore&lt;/em&gt;).&lt;/p&gt;
&lt;p&gt;Besides the &lt;a href="https://2022.stateofthemap.org/programme/" target="_blank" rel="noopener"&gt;talks&lt;/a&gt;, consider checking out the &lt;a href="https://2022.stateofthemap.org/posters/" target="_blank" rel="noopener"&gt;posters&lt;/a&gt; as well.&lt;/p&gt;
&lt;p&gt;Thanks to the organisers for this wonderful event, and to the &lt;a href="https://2022.stateofthemap.org/#sponsors" target="_blank" rel="noopener"&gt;sponsors&lt;/a&gt; for supporting it!&lt;/p&gt;</description></item><item><title>New paper: A review of spatially-explicit GeoAI applications in Urban Geography</title><link>https://ual.sg/post/2022/08/12/new-paper-a-review-of-spatially-explicit-geoai-applications-in-urban-geography/</link><pubDate>Fri, 12 Aug 2022 05:11:16 +0800</pubDate><guid>https://ual.sg/post/2022/08/12/new-paper-a-review-of-spatially-explicit-geoai-applications-in-urban-geography/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Liu P, Biljecki F (2022): A review of spatially-explicit GeoAI applications in Urban Geography. &lt;em&gt;International Journal of Applied Earth Observation and Geoinformation&lt;/em&gt; 112: 102936. &lt;a href="https://doi.org/10.1016/j.jag.2022.102936" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.jag.2022.102936&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-jag-geoai/2022-jag-geoai.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This review paper was led by &lt;a href="https://ual.sg/author/pengyuan-liu/"&gt;Pengyuan Liu&lt;/a&gt;.
Congratulations on his first journal paper during his tenure in the Lab. &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Techniques and applications of spatially-explicit GeoAI in Urban Geography.&lt;/li&gt;
&lt;li&gt;The review focuses on urban dynamics, social differentiation, and social sensing.&lt;/li&gt;
&lt;li&gt;The development of this line in Urban Geography is still in its early phase.&lt;/li&gt;
&lt;li&gt;Graph neural networks are promising solutions to incorporate spatial information.&lt;/li&gt;
&lt;li&gt;Challenges identified are data, scale, MAUP, and lack of interpretation (black box).&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Urban Geography studies forms, social fabrics, and economic structures of cities from a geographic perspective. Catalysed by the increasingly abundant spatial big data, Urban Geography seeks new models and research paradigms to explain urban phenomena and address urban issues. Recent years have witnessed significant advances in spatially-explicit geospatial artificial intelligence (GeoAI), which integrates spatial studies and AI, primarily focusing on incorporating spatial thinking and concept into deep learning models for urban studies. This paper provides an overview of techniques and applications of spatially-explicit GeoAI in Urban Geography based on 581 papers identified using a systematic review approach. We examined and screened papers in three scopes of Urban Geography (Urban Dynamics, Social Differentiation of Urban Areas, and Social Sensing) and found that although GeoAI is a trending topic in geography and the applications of deep neural network-based methods are proliferating, the development of spatially-explicit GeoAI models is still at their early phase. We identified three challenges of existing models and advised future research direction towards developing multi-scale explainable spatially-explicit GeoAI. This review paper acquaints beginners with the basics of GeoAI and state-of-the-art and serve as an inspiration to attract more research in exploring the potential of spatially-explicit GeoAI in studying the socio-economic dimension of the city and urban life.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-jag-geoai/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-jag-geoai/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/08/12/new-paper-a-review-of-spatially-explicit-geoai-applications-in-urban-geography/page-one_hu_c4a8270264003e6d.webp 400w,
/post/2022/08/12/new-paper-a-review-of-spatially-explicit-geoai-applications-in-urban-geography/page-one_hu_b90531c7019fdabc.webp 760w,
/post/2022/08/12/new-paper-a-review-of-spatially-explicit-geoai-applications-in-urban-geography/page-one_hu_844665db4bc86266.webp 1200w"
src="https://ual.sg/post/2022/08/12/new-paper-a-review-of-spatially-explicit-geoai-applications-in-urban-geography/page-one_hu_c4a8270264003e6d.webp"
width="569"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_jag_geoai&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Liu, Pengyuan and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Applied Earth Observation and Geoinformation}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{A review of spatially-explicit GeoAI applications in Urban Geography}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{112}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{102936}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.jag.2022.102936}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: 3D building metrics for urban morphology</title><link>https://ual.sg/post/2022/08/01/new-paper-3d-building-metrics-for-urban-morphology/</link><pubDate>Mon, 01 Aug 2022 20:51:16 +0800</pubDate><guid>https://ual.sg/post/2022/08/01/new-paper-3d-building-metrics-for-urban-morphology/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper in which we were involved:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Labetski A, Vitalis S, Biljecki F, Arroyo Ohori K, Stoter J (2023): 3D building metrics for urban morphology. &lt;em&gt;International Journal of Geographical Information Science&lt;/em&gt;, 37(1): 36-67. &lt;a href="https://doi.org/10.1080/13658816.2022.2103818" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1080/13658816.2022.2103818&lt;/a&gt; &lt;a href="https://ual.sg/publication/2023-ijgis-3-dbm/2023-ijgis-3-dbm.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Congratulations to &lt;a href="http://3d.bk.tudelft.nl/alabetski" target="_blank" rel="noopener"&gt;Anna Labetski&lt;/a&gt; and &lt;a href="http://3d.bk.tudelft.nl/svitalis" target="_blank" rel="noopener"&gt;Stelios Vitalis&lt;/a&gt; from the &lt;a href="https://3d.bk.tudelft.nl" target="_blank" rel="noopener"&gt;3D Geoinformation group&lt;/a&gt; at TU Delft for the publication of their work, and thanks for this productive and exciting collaboration!&lt;/p&gt;
&lt;p&gt;The paper introduces an advanced set of 3D metrics to characterise buildings and takes advantage of the increasing availability and detail of 3D building models (related work measures only basic metrics such as the height of buildings and envelope area).
We believe that it sets the scene for truly 3D urban morphology studies.
The software developed is &lt;a href="https://github.com/tudelft3d/3d-building-metrics" target="_blank" rel="noopener"&gt;released open-source&lt;/a&gt;, and the dataset used for the research, containing metrics for 823,000 buildings in the Netherlands, is &lt;a href="https://doi.org/10.7910/DVN/6QCRRF" target="_blank" rel="noopener"&gt;released as open data&lt;/a&gt;.&lt;/p&gt;
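&lt;p&gt;To illustrate why 3D metrics add information beyond 2D footprints, consider a box-shaped building: two buildings with identical footprints can differ sharply in volume and compactness. The following is a simplified sketch with made-up dimensions, not code from the released software:&lt;/p&gt;

```python
# Hypothetical box-shaped building: 10 m x 8 m footprint, 25 m tall.
w, d, h = 10.0, 8.0, 25.0

footprint_area = w * d                       # classic 2D metric
volume = w * d * h                           # simple 3D metric
envelope_area = 2 * (w * d + w * h + d * h)  # roof + floor + four walls
surface_to_volume = envelope_area / volume   # compactness indicator

# A building with the same 80 m^2 footprint but half the height would
# have half the volume and a different surface-to-volume ratio, which a
# purely 2D analysis could not distinguish.
```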
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Urban morphology is important in a broad range of investigations across the fields of city planning, transportation, climate, energy, and urban data science. Characterising buildings with a set of numerical metrics is fundamental to studying the urban form. Despite the rapid developments in 3D geoinformation science, and the growing 3D data availability, most studies simplify buildings to their 2D footprint, and when taking their height into account, they at most assume one height value per building, i.e. simple 3D. We take the first step in elevating building metrics into full/true 3D, uncovering the use of higher levels of detail, and taking into account the detailed shape of a building. We set the foundation of the new research line on 3D urban morphology by providing a comprehensive set of 3D metrics, implementing them in openly released software, generating an open dataset containing 2D and 3D metrics for 823,000 buildings in the Netherlands, and demonstrating a use case where clusters and architectural patterns are analysed through time. Our experiments suggest the added value of 3D metrics to complement existing counterparts, reducing ambiguity, and providing advanced insights. Furthermore, we provide a comparative analysis using different levels of detail of 3D building models.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2023-ijgis-3-dbm/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2023-ijgis-3-dbm/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/08/01/new-paper-3d-building-metrics-for-urban-morphology/page-one_hu_b79cc04fc9aa6f12.webp 400w,
/post/2022/08/01/new-paper-3d-building-metrics-for-urban-morphology/page-one_hu_5469c3a174ed7cbb.webp 760w,
/post/2022/08/01/new-paper-3d-building-metrics-for-urban-morphology/page-one_hu_57c3c4dcf6c43e97.webp 1200w"
src="https://ual.sg/post/2022/08/01/new-paper-3d-building-metrics-for-urban-morphology/page-one_hu_b79cc04fc9aa6f12.webp"
width="746"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2023_ijgis_3dbm&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Labetski, Anna and Vitalis, Stelios and Biljecki, Filip and Arroyo Ohori, Ken and Stoter, Jantien}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1080/13658816.2022.2103818}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Geographical Information Science}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{3D building metrics for urban morphology}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2023}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{37}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{36-67}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Urban Analytics Lab on tour in the United States</title><link>https://ual.sg/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/</link><pubDate>Thu, 21 Jul 2022 08:23:28 +0800</pubDate><guid>https://ual.sg/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/</guid><description>&lt;p&gt;The PI of the NUS Urban Analytics Lab, Prof &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, has been on a work trip in the US &amp;#x1f1fa;&amp;#x1f1f8; where he has visited 9 universities and delivered lectures, talks, and other collaborative exchanges.&lt;/p&gt;
&lt;p&gt;The full list of visited universities and groups is as follows, with some photos below.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Massachusetts Institute of Technology, &lt;a href="https://senseable.mit.edu/" target="_blank" rel="noopener"&gt;Senseable City Lab&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Harvard University, &lt;a href="https://gis.harvard.edu/" target="_blank" rel="noopener"&gt;Center for Geographic Analysis&lt;/a&gt; (&lt;a href="https://gis.harvard.edu/event/geospatial-research-urban-analytics-lab-national-university-singapore" target="_blank" rel="noopener"&gt;link&lt;/a&gt; to the talk)&lt;/li&gt;
&lt;li&gt;Massachusetts Institute of Technology, &lt;a href="http://web.mit.edu/sustainabledesignlab/" target="_blank" rel="noopener"&gt;Sustainable Design Lab&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;University of Pennsylvania, &lt;a href="https://www.design.upenn.edu/" target="_blank" rel="noopener"&gt;Stuart Weitzman School of Design&lt;/a&gt;, &lt;a href="https://thermal-architecture.org/" target="_blank" rel="noopener"&gt;Thermal Architecture Lab&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Princeton University, &lt;a href="https://soa.princeton.edu/" target="_blank" rel="noopener"&gt;School of Architecture&lt;/a&gt;, &lt;a href="https://soa.princeton.edu/content/c.h..o.s.-lab" target="_blank" rel="noopener"&gt;CHAOS Lab&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Carnegie Mellon University, &lt;a href="https://soa.cmu.edu/" target="_blank" rel="noopener"&gt;School of Architecture&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.colorado.edu/" target="_blank" rel="noopener"&gt;University of Colorado Boulder&lt;/a&gt; / &lt;a href="https://www.nrel.gov/index.html" target="_blank" rel="noopener"&gt;National Renewable Energy Laboratory&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.lbl.gov/" target="_blank" rel="noopener"&gt;Lawrence Berkeley National Laboratory&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;University of California, Berkeley, &lt;a href="https://cbe.berkeley.edu/" target="_blank" rel="noopener"&gt;Center for the Built Environment&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Stanford University, &lt;a href="https://cee.stanford.edu/" target="_blank" rel="noopener"&gt;Civil and Environmental Engineering&lt;/a&gt;, &lt;a href="https://www.uil.stanford.edu/" target="_blank" rel="noopener"&gt;Urban Informatics Lab&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;We look forward to collaborating with these great research groups, and thank them for their hospitality.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/1_hu_b13da76cce71ad30.webp 400w,
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/1_hu_99847c3ba5e64073.webp 760w,
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/1_hu_3d3ef37eb8432dc3.webp 1200w"
src="https://ual.sg/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/1_hu_b13da76cce71ad30.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/2_hu_fb8f2505d6a2f571.webp 400w,
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/2_hu_bc91b555862f7272.webp 760w,
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/2_hu_cca475ef2f4da800.webp 1200w"
src="https://ual.sg/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/2_hu_fb8f2505d6a2f571.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/3_hu_477a459850972506.webp 400w,
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/3_hu_22b8c5dff8c96fc5.webp 760w,
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/3_hu_cfe0a4c09a393ce4.webp 1200w"
src="https://ual.sg/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/3_hu_477a459850972506.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/4_hu_9c85eab7ca567731.webp 400w,
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/4_hu_789abc057400da96.webp 760w,
/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/4_hu_6112d456f8ff9d78.webp 1200w"
src="https://ual.sg/post/2022/07/21/urban-analytics-lab-on-tour-in-the-united-states/4_hu_9c85eab7ca567731.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Credits to &lt;a href="http://www.kkyyhh96.site/" target="_blank" rel="noopener"&gt;Yuhao Kang&lt;/a&gt; and &lt;a href="https://scholar.harvard.edu/junghwankim/home" target="_blank" rel="noopener"&gt;Junghwan Kim&lt;/a&gt; for some of the photos. &amp;#x1f64f;&lt;/p&gt;&lt;/description&gt;&lt;/item&gt;&lt;item&gt;&lt;title&gt;Our seminar on UAV&lt;/title&gt;&lt;link&gt;https://ual.sg/post/2022/07/21/our-seminar-on-uav/&lt;/link&gt;&lt;pubDate&gt;Thu, 21 Jul 2022 07:33:28 +0800&lt;/pubDate&gt;&lt;guid&gt;https://ual.sg/post/2022/07/21/our-seminar-on-uav/&lt;/guid&gt;&lt;description&gt;&lt;p&gt;Our researchers &lt;a href="https://ual.sg/author/leon-gaw/"&gt;Leon Gaw&lt;/a&gt; and &lt;a href="https://ual.sg/author/junjie-luo/"&gt;Junjie Luo&lt;/a&gt; have conducted a seminar on UAVs for the rest of the group, and discussed their potential applications in research.
Drones are increasingly important in research on the built environment.
For example, Junjie has been busy collecting and segmenting oblique imagery of riverscapes in Tianjin, China, to support his PhD research on perception and participatory planning.
He has released the dataset openly (see the &lt;a href="https://github.com/ualsg/semantic-riverscapes-dataset" target="_blank" rel="noopener"&gt;Github repo&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/07/21/our-seminar-on-uav/1_hu_9d68a09df6057727.webp 400w,
/post/2022/07/21/our-seminar-on-uav/1_hu_ab52a42ba320d382.webp 760w,
/post/2022/07/21/our-seminar-on-uav/1_hu_8c37d5f14e32af2f.webp 1200w"
src="https://ual.sg/post/2022/07/21/our-seminar-on-uav/1_hu_9d68a09df6057727.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/07/21/our-seminar-on-uav/2_hu_6f3fec061464f0b8.webp 400w,
/post/2022/07/21/our-seminar-on-uav/2_hu_de543b883690a118.webp 760w,
/post/2022/07/21/our-seminar-on-uav/2_hu_5aa01a37c7e6d9db.webp 1200w"
src="https://ual.sg/post/2022/07/21/our-seminar-on-uav/2_hu_6f3fec061464f0b8.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/07/21/our-seminar-on-uav/3_hu_7e61f783dafa5f16.webp 400w,
/post/2022/07/21/our-seminar-on-uav/3_hu_7049fe2706aa658c.webp 760w,
/post/2022/07/21/our-seminar-on-uav/3_hu_d8ee0b67adaf7403.webp 1200w"
src="https://ual.sg/post/2022/07/21/our-seminar-on-uav/3_hu_7e61f783dafa5f16.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Many thanks to Junjie and Leon! &amp;#x1f64f;&lt;/p&gt;</description></item><item><title>New paper: 3D building reconstruction from single street view images using deep learning</title><link>https://ual.sg/post/2022/06/17/new-paper-3d-building-reconstruction-from-single-street-view-images-using-deep-learning/</link><pubDate>Fri, 17 Jun 2022 13:21:16 +0800</pubDate><guid>https://ual.sg/post/2022/06/17/new-paper-3d-building-reconstruction-from-single-street-view-images-using-deep-learning/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Pang HE, Biljecki F (2022): 3D building reconstruction from single street view images using deep learning. &lt;em&gt;International Journal of Applied Earth Observation and Geoinformation&lt;/em&gt; 112: 102859. &lt;a href="https://doi.org/10.1016/j.jag.2022.102859" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.jag.2022.102859&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-jag-3-d-svi/2022-jag-3-d-svi.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This research was led by &lt;a href="https://ual.sg/author/hui-en-pang/"&gt;Hui En Pang&lt;/a&gt; as part of her MSc.
Congratulations to her on her graduation, and we wish Hui En all the best in her PhD at NTU. &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Street-level photographs are now an omnipresent urban dataset.&lt;/li&gt;
&lt;li&gt;Buildings are often imaged multiple times, but most photos are obstructed.&lt;/li&gt;
&lt;li&gt;An approach to generate 3D models of buildings from their single views.&lt;/li&gt;
&lt;li&gt;The method can be aided by building footprints.&lt;/li&gt;
&lt;li&gt;The resulting models are usable for a variety of use cases.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;3D building models are an established instance of geospatial information in the built environment, but their acquisition remains complex and topical. Approaches to reconstruct 3D building models often require existing building information (e.g. their footprints) and data such as point clouds, which are scarce and laborious to acquire, limiting their expansion. In parallel, street view imagery (SVI) has been gaining currency, driven by the rapid expansion in coverage and advances in computer vision (CV), but it has not been used much for generating 3D city models. Traditional approaches that can use SVI for reconstruction require multiple images, while in practice, often only few street-level images provide an unobstructed view of a building. We develop the reconstruction of 3D building models from a single street view image using image-to-mesh reconstruction techniques modified from the CV domain. We regard three scenarios: (1) standalone single-view reconstruction; (2) reconstruction aided by a top view delineating the footprint; and (3) refinement of existing 3D models, i.e. we examine the use of SVI to enhance the level of detail of block (LoD1) models, which are common. The results suggest that trained models supporting (2) and (3) are able to reconstruct the overall geometry of a building, while the first scenario may derive the approximate mass of the building, useful to infer the urban form of cities. We evaluate the results by demonstrating their usefulness for volume estimation, with mean errors of less than 10% for the last two scenarios. As SVI is now available in most countries worldwide, including many regions that do not have existing footprint and/or 3D building data, our method can derive rapidly and cost-effectively the 3D urban form from SVI without requiring any existing building information. 
Obtaining 3D building models in regions that hitherto did not have any may enable a number of 3D geospatial analyses locally for the first time.&lt;/p&gt;
&lt;/blockquote&gt;
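&lt;p&gt;As a rough illustration of the volume-estimation check mentioned in the abstract, an LoD1 block model reduces to a footprint polygon extruded to a single height, so its volume is footprint area times height. The sketch below uses our own toy footprint and a hypothetical reference volume, not data from the paper:&lt;/p&gt;

```python
def shoelace_area(pts):
    """Area of a simple polygon from its (x, y) vertices in order."""
    a = 0.0
    n = len(pts)
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        a += x1 * y2 - x2 * y1
    return abs(a) / 2.0

# Hypothetical reconstructed LoD1 model: footprint in metres, one height.
footprint = [(0, 0), (12, 0), (12, 9), (0, 9)]
height = 20.0

estimated = shoelace_area(footprint) * height  # extruded block volume
reference = 2300.0                             # made-up ground-truth volume
error_pct = abs(estimated - reference) / reference * 100
```

&lt;p&gt;Here the estimate would be 2160 m³ against the assumed reference of 2300 m³, an error of about 6%, i.e. within the sub-10% range the paper reports for its footprint-aided scenarios.&lt;/p&gt;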
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-jag-3-d-svi/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-jag-3-d-svi/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/06/17/new-paper-3d-building-reconstruction-from-single-street-view-images-using-deep-learning/page-one_hu_bc15a9d9570efef8.webp 400w,
/post/2022/06/17/new-paper-3d-building-reconstruction-from-single-street-view-images-using-deep-learning/page-one_hu_c6225c4b14efefaa.webp 760w,
/post/2022/06/17/new-paper-3d-building-reconstruction-from-single-street-view-images-using-deep-learning/page-one_hu_e79fea783aae95cc.webp 1200w"
src="https://ual.sg/post/2022/06/17/new-paper-3d-building-reconstruction-from-single-street-view-images-using-deep-learning/page-one_hu_bc15a9d9570efef8.webp"
width="573"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_jag_3d_svi&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Pang, Hui En and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Applied Earth Observation and Geoinformation}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{3D building reconstruction from single street view images using deep learning}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{112}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{102859}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.jag.2022.102859}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Free and open source urbanism: Software for urban planning practice</title><link>https://ual.sg/post/2022/06/10/new-paper-free-and-open-source-urbanism-software-for-urban-planning-practice/</link><pubDate>Fri, 10 Jun 2022 07:31:16 +0800</pubDate><guid>https://ual.sg/post/2022/06/10/new-paper-free-and-open-source-urbanism-software-for-urban-planning-practice/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Yap W, Janssen P, Biljecki F (2022): Free and open source urbanism: Software for urban planning practice. &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt; 96: 101825. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2022.101825" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2022.101825&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-ceus-open-source-urbanism/2022-ceus-open-source-urbanism.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This review paper was led by &lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt;.
Congratulations to him on the first journal paper of his PhD, in a top journal no less. &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/06/10/new-paper-free-and-open-source-urbanism-software-for-urban-planning-practice/urban-planning-process_hu_b1f496a4d213e9df.webp 400w,
/post/2022/06/10/new-paper-free-and-open-source-urbanism-software-for-urban-planning-practice/urban-planning-process_hu_1a8d3b85125d29b0.webp 760w,
/post/2022/06/10/new-paper-free-and-open-source-urbanism-software-for-urban-planning-practice/urban-planning-process_hu_c4b8b1ee0e1dd93a.webp 1200w"
src="https://ual.sg/post/2022/06/10/new-paper-free-and-open-source-urbanism-software-for-urban-planning-practice/urban-planning-process_hu_b1f496a4d213e9df.webp"
width="588"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;State of the art of open source software for urban planning.&lt;/li&gt;
&lt;li&gt;Open source software is increasingly supporting urban planning.&lt;/li&gt;
&lt;li&gt;The current landscape of open tools presents numerous opportunities to augment a wide range of urban analytical processes.&lt;/li&gt;
&lt;li&gt;70 relevant tools for urban planning, categorised according to planning process phases, application domains, and use cases.&lt;/li&gt;
&lt;li&gt;An extended list of 54 peripheral tools that provide additional support for domains related to urban planning.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Free and open source tools present numerous opportunities to support current urban planning practice. However, their overview is fragmented, and the uptake among planning professionals remains lacklustre. Recent discourse in the domain of planning support tools attributes poor uptake to a lack of understanding of the landscape and functionality of available tools, and of how tools can add value to the planning process. We provide an understanding of the state of the art concerning open source tools for urban planning from journal articles, software repositories, and social media. Our search documented 70 open source tools that support different stages of the urban planning process. We cover an additional set of 54 peripheral tools to support domains related to urban planning. In the process, we formalise and describe the urban planning process and find that the entire planning process can be conducted using open source software.
Tools focusing on street networks and geographic spatial analysis are the mainstay of current implementation. Sixty percent of tools are only accessible through an application programming interface, while 43% rely on Python for development. The scenario planning, public participation, and evaluation phases of the planning process present many untapped opportunities for open source software development. Findings will help urban planners and researchers to employ these tools for professional practice, and assist software developers to identify opportunities for software development in urban research.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-ceus-open-source-urbanism/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-ceus-open-source-urbanism/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/06/10/new-paper-free-and-open-source-urbanism-software-for-urban-planning-practice/page-one_hu_a3c283412123e555.webp 400w,
/post/2022/06/10/new-paper-free-and-open-source-urbanism-software-for-urban-planning-practice/page-one_hu_6ad4e3174ffd3527.webp 760w,
/post/2022/06/10/new-paper-free-and-open-source-urbanism-software-for-urban-planning-practice/page-one_hu_b262a29e998d6924.webp 1200w"
src="https://ual.sg/post/2022/06/10/new-paper-free-and-open-source-urbanism-software-for-urban-planning-practice/page-one_hu_a3c283412123e555.webp"
width="564"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_ceus_open_source_urbanism&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Yap, Winston and Janssen, Patrick and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2022.101825}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{101825}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Free and open source urbanism: Software for urban planning practice}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{96}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Creating a map for the future</title><link>https://ual.sg/post/2022/06/09/creating-a-map-for-the-future/</link><pubDate>Thu, 09 Jun 2022 09:33:28 +0800</pubDate><guid>https://ual.sg/post/2022/06/09/creating-a-map-for-the-future/</guid><description>&lt;p&gt;The PI of our Lab, Dr &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, was featured by NUS in their series &lt;a href="https://news.nus.edu.sg/?h=1&amp;amp;t=Proof%20of%20Passion" target="_blank" rel="noopener"&gt;&lt;em&gt;Proof of Passion&lt;/em&gt;&lt;/a&gt;.
In this series, &lt;a href="https://news.nus.edu.sg" target="_blank" rel="noopener"&gt;NUS News&lt;/a&gt; profiles the University’s Presidential Young Professors who are at the forefront of their research fields, turning creative ideas into important innovations that make the world better.&lt;/p&gt;
&lt;p&gt;We thank NUS for their support and for writing about our research.&lt;/p&gt;
&lt;p&gt;The &lt;a href="https://news.nus.edu.sg/creating-a-map-for-the-future" target="_blank" rel="noopener"&gt;full article&lt;/a&gt; is copied below.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;Maps have been one of the most important ways that humans have understood the physical world. They have mediated global trade routes, decided the victors of wars and helped the construction of towering metropolises. Today, in the digital age, we can log onto our computers and point a cursor at the furthest reaches of the globe, and see a map and street view of its cities and towns.&lt;/p&gt;
&lt;p&gt;It is not surprising that many of us assume that the world is basically known to us, and there are no more gaps in our maps to fill. However, Presidential Young Professor &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; from the &lt;a href="https://cde.nus.edu.sg/arch/" target="_blank" rel="noopener"&gt;Department of Architecture&lt;/a&gt; under the &lt;a href="https://cde.nus.edu.sg/" target="_blank" rel="noopener"&gt;NUS College of Design and Engineering&lt;/a&gt;, and the &lt;a href="https://bschool.nus.edu.sg/real-estate/" target="_blank" rel="noopener"&gt;Department of Real Estate&lt;/a&gt; under the &lt;a href="https://bschool.nus.edu.sg/" target="_blank" rel="noopener"&gt;NUS Business School&lt;/a&gt;, explains that this is not the case. Sometimes, the least known places are right outside our doors. How tall are the buildings that we see outside our windows? How many of them have rooftop gardens or solar panels? How many of those buildings receive direct sunlight or a nice cooling breeze?&lt;/p&gt;
&lt;p&gt;Far from being trivial questions, these missing datasets hold invaluable insights for the coming age of urban planning. In the Singaporean context, Assistant Professor Biljecki points out that it is vital to know how the shape and size of buildings impact variables like energy consumption, urban vibrancy, and the microclimate. If too many buildings are in direct sunlight and do not get airflow, it is likely that more energy will be spent on cooling – leaving a bigger carbon footprint. “One of the things that we noticed talking to people who use this kind of data, but who are not currently in the field like we are, is that they take data for granted, especially the quality,” he said.&lt;/p&gt;
&lt;h3 id="filling-the-knowledge-gap-in-modern-maps"&gt;Filling the knowledge gap in modern maps&lt;/h3&gt;
&lt;p&gt;Asst Prof Biljecki, who specialises in geospatial data science, joined NUS in 2017. Here, he established the NUS Urban Analytics Lab, a research group dedicated to geospatial and 3D urban modelling, and their work aligns well with the University’s key research priorities of developing integrative sustainability solutions as well as capabilities to accelerate Singapore’s transformation into a smart nation.&lt;/p&gt;
&lt;p&gt;Much of his work looks at plugging the knowledge gaps in our modern maps. Some of these gaps exist because data gathering is expensive. Before the advent of artificial intelligence, data gathering was done by surveyors manually, which led to huge data inequalities between different stakeholders and countries. Other problems, like the varying levels of information transparency around the world, also impact global data access. Countries that are the most impoverished are unable to use data to guide many developmental policy decisions.&lt;/p&gt;
&lt;p&gt;To overcome this, Asst Prof Biljecki and the researchers he works with are taking advantage of modern computing techniques as much as possible. “Artificial intelligence has really brought a revolution to my field,” he said, “It has finally enabled the automatic extraction of data for a fraction of the cost of previous approaches.”&lt;/p&gt;
&lt;p&gt;In the past, the teams that Asst Prof Biljecki has worked in have used computer vision and images from satellites or street views to estimate variables like building heights. By using this unprecedented visibility of urban spaces, researchers can now begin to fill in the blank spaces in our knowledge. In one of his co-authored papers, Asst Prof Biljecki used &lt;a href="https://news.nus.edu.sg/nus-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/" target="_blank" rel="noopener"&gt;AI to count the rooftops covered in solar panels or green spaces across 17 cities&lt;/a&gt;: a testament to how far satellite imaging has come. “Geospatial technologies can help us to answer if it is beneficial to put solar panels on, let&amp;rsquo;s say, 20% of rooftops in a city. Thanks to simulations, we can estimate whether that makes economic sense and whether it makes environmental sense,” he said.&lt;/p&gt;
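&lt;p&gt;As a rough illustration of the kind of estimate mentioned in the quote above, a back-of-envelope calculation might look like the sketch below. All figures (rooftop area, irradiance, efficiency, household consumption) are hypothetical placeholders, not values from the research.&lt;/p&gt;

```python
# Illustrative back-of-envelope estimate of rooftop solar potential.
# All numbers are hypothetical placeholders, not results from the paper.

def annual_solar_yield_kwh(roof_area_m2, coverage, irradiance_kwh_m2_yr=1600,
                           panel_efficiency=0.20, performance_ratio=0.75):
    """Rough annual PV output for a given fraction of rooftop area covered."""
    usable_area = roof_area_m2 * coverage
    return usable_area * irradiance_kwh_m2_yr * panel_efficiency * performance_ratio

# Hypothetical city: 5 km^2 of total rooftop area, 20% covered with panels.
yield_kwh = annual_solar_yield_kwh(5_000_000, 0.20)
households_powered = yield_kwh / 4_000  # assuming 4,000 kWh per household per year
print(f"{yield_kwh / 1e9:.2f} TWh/yr, roughly {households_powered:,.0f} households")
```

&lt;p&gt;In practice such simulations account for roof orientation, shading, and local irradiance data, which a 3D city model can supply; the sketch only shows the arithmetic skeleton.&lt;/p&gt;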
&lt;p&gt;
&lt;figure id="figure-asst-prof-biljecki-right-and-mr-abraham-noah-wu-left-showing-the-features-of-roofpedia-an-automated-tool-that-they-had-developed-which-uses-satellite-images-to-track-solar-and-green-roof-penetration"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Asst Prof Biljecki (right) and Mr Abraham Noah Wu (left) showing the features of Roofpedia, an automated tool that they had developed, which uses satellite images to track solar and green roof penetration." srcset="
/post/2022/06/09/creating-a-map-for-the-future/1920_filipbiljecki-roofpedia_hu_cf662ccd02245d9e.webp 400w,
/post/2022/06/09/creating-a-map-for-the-future/1920_filipbiljecki-roofpedia_hu_f8ec48d301fab683.webp 760w,
/post/2022/06/09/creating-a-map-for-the-future/1920_filipbiljecki-roofpedia_hu_b6ccbe9e1911b5ae.webp 1200w"
src="https://ual.sg/post/2022/06/09/creating-a-map-for-the-future/1920_filipbiljecki-roofpedia_hu_cf662ccd02245d9e.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Asst Prof Biljecki (right) and Mr Abraham Noah Wu (left) showing the features of Roofpedia, an automated tool that they had developed, which uses satellite images to track solar and green roof penetration.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Other applications of his work are so wide-ranging that it is challenging to unify them into distinct categories. In 2021, Asst Prof Biljecki’s team published a method for characterising cooling systems in buildings just by using infrared images of the air-conditioning units mounted on the building’s exterior. Equipped with that information, people can now get a clearer picture of the energy consumption behaviour of a building’s residents. In the same year, Asst Prof Biljecki also helped to develop a comprehensive index of how ‘bike-able’ different cities are with the aid of street view imagery. Future city planners may one day use it to build more bicycle-friendly infrastructure and encourage greener transport.&lt;/p&gt;
&lt;h3 id="open-sharing-of-data-interdisciplinary-collaboration-the-way-forward"&gt;Open sharing of data, interdisciplinary collaboration the way forward&lt;/h3&gt;
&lt;p&gt;Geospatial data is multimodal and multidisciplinary, and Asst Prof Biljecki prides himself on embracing the colourful nature of his research.&lt;/p&gt;
&lt;p&gt;Datasets about urban environments unsurprisingly intersect many different disciplines. After all, a city is a nexus point where important issues meet: energy, transport, livability, smart technology, and more. Asst Prof Biljecki said he has become more multidisciplinary with each project. “Even when walking out into the city, one inevitable thing I notice is how intertwined everything is,” he commented.&lt;/p&gt;
&lt;p&gt;In the past, he has worked with researchers with specialties from computing to transportation, and is a strong believer that this trend will only strengthen over time.&lt;/p&gt;
&lt;p&gt;Strengthening the open sharing and use of data and resources is something that Asst Prof Biljecki feels strongly about. Much of the work he does is open-source: available for anyone to use and adapt. Increasing access to data also allows for more analyses to be conducted by NGOs, academics and corporations, many of whom are interested in the same questions.&lt;/p&gt;
&lt;p&gt;For countries with low amounts of data, crowdsourcing is often a viable option, Asst Prof Biljecki shared, “There are crowdsourced efforts that are voluntary. People use their time and effort to map cities – one example is OpenStreetMap, the world&amp;rsquo;s largest repository of crowdsourced geospatial data.” Such crowdsourced repositories will fuel research in the future. The race to build smarter and better cities is not won by the efforts of a few, but by the efforts of many people who help each other: a democratic, communal effort.&lt;/p&gt;
&lt;p&gt;On a foundation of more comprehensive and widely available data, the future possibilities are endless. Asst Prof Biljecki has a few predictions for what he believes the future will hold.&lt;/p&gt;
&lt;p&gt;There are the perennial developmental issues such as the growth of smart cities and achieving carbon neutrality that he will continue to be involved in. For those, Asst Prof Biljecki believes that ‘digital twins’ could serve as an invaluable tool for future city-wide simulations. Digital twins are virtual representations of cityscapes. They can be something like a relatively simple 3D map of a city with real-time information or full models that can simulate urban micro-climates, and could help policymakers and researchers better understand how to build back greener and smarter.&lt;/p&gt;
&lt;p&gt;More Proof of Passion stories &lt;a href="https://news.nus.edu.sg/?h=1&amp;amp;t=Proof%20of%20Passion" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;The NUS Presidential Young Professorship (PYP) scheme supports talented young academics with excellent research track records in advancing their cutting-edge research. More information about the PYP scheme is available &lt;a href="https://www.nus.edu.sg/careers/NUS-Presidential-Young-Professorship.pdf" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;</description></item><item><title>ifc2indoorgml: An open-source tool for generating IndoorGML from IFC</title><link>https://ual.sg/post/2022/06/05/ifc2indoorgml-an-open-source-tool-for-generating-indoorgml-from-ifc/</link><pubDate>Sun, 05 Jun 2022 17:51:16 +0800</pubDate><guid>https://ual.sg/post/2022/06/05/ifc2indoorgml-an-open-source-tool-for-generating-indoorgml-from-ifc/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper in which we were involved:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Diakite AA, Díaz-Vilariño L, Biljecki F, Isikdag Ü, Simmons S, Li K, Zlatanova S (2022): ifc2indoorgml: An open-source tool for generating IndoorGML from IFC. &lt;em&gt;Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.&lt;/em&gt; XLIII-B4-2022: 295–301. &lt;a href="https://doi.org/10.5194/isprs-archives-xliii-b4-2022-295-2022" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-archives-xliii-b4-2022-295-2022&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-isprs-ifc-2-indoorgml/2022-isprs-ifc-2-indoorgml.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This conference paper describes &lt;a href="https://github.com/grid-unsw/ifc2indoorgml" target="_blank" rel="noopener"&gt;ifc2indoorgml&lt;/a&gt;, a tool to generate IndoorGML files from IFC input models.
The collaboration was supported by the &lt;a href="https://www.isprs.org/society/si/default.aspx" target="_blank" rel="noopener"&gt;ISPRS Scientific Initiatives 2021&lt;/a&gt; and the Ministry of Land, Infrastructure, and Transport of the Korean government through the iNOUS initiative led by Pusan National University.&lt;/p&gt;
&lt;p&gt;The project, development, and paper were led by &lt;a href="https://www.unsw.edu.au/staff/abdoulaye-diakite" target="_blank" rel="noopener"&gt;Dr Abdoulaye Diakite&lt;/a&gt; from the &lt;a href="https://www.unsw.edu.au/arts-design-architecture/our-schools/built-environment/our-research/clusters-groups/grid" target="_blank" rel="noopener"&gt;Geospatial Research, Innovation and Development (GRID) Lab&lt;/a&gt; at the University of New South Wales (Australia).
Others involved in this project are collaborators from the University of Vigo (Spain), Mimar Sinan Fine Arts University (Turkey), the Open Geospatial Consortium, and Pusan National University (South Korea).&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The interest in 3D indoor models has been continuously growing. Most such models are made available as point clouds or BIM (e.g., IFC), the former being generally provided as unstructured information while the latter comes highly structured and rich in semantic information. IFC models are consequently more suitable for direct use, but they can be very complex and contain too many details, which often raises privacy concerns. IndoorGML is one of the standards for describing 3D indoor space with the purpose of supporting Location Based Services (LBS). It relies on solid scientific concepts and offers high flexibility through extension mechanisms. It provides a geometric, topological, and semantic description of the indoor environment, which specifically facilitates applications such as indoor navigation or facility management. Additionally, it can represent complex indoor environments without compromising privacy, thanks to its high level of abstraction. However, despite its solid conceptual basis, IndoorGML suffers from a lack of practical tools and remains hard to produce, making it largely unavailable. In this project, we developed an open-source tool named ifc2indoorgml that automatically generates IndoorGML models from IFC data. We discuss the workflow and the different development approaches. By making such a tool available to the wider public, we expect more 3D IndoorGML models to be created and made freely available for research and development within the spatial community and beyond.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-isprs-ifc-2-indoorgml/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-isprs-ifc-2-indoorgml/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/06/05/ifc2indoorgml-an-open-source-tool-for-generating-indoorgml-from-ifc/page-one_hu_fae1c9f3b64d2836.webp 400w,
/post/2022/06/05/ifc2indoorgml-an-open-source-tool-for-generating-indoorgml-from-ifc/page-one_hu_18b2b08c327862a2.webp 760w,
/post/2022/06/05/ifc2indoorgml-an-open-source-tool-for-generating-indoorgml-from-ifc/page-one_hu_ed39d888526adfc8.webp 1200w"
src="https://ual.sg/post/2022/06/05/ifc2indoorgml-an-open-source-tool-for-generating-indoorgml-from-ifc/page-one_hu_fae1c9f3b64d2836.webp"
width="528"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_isprs_ifc2indoorgml&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Diakite, AA and Díaz-Vilariño, L and Biljecki, F and Isikdag, Ü and Simmons, S and Li, K and Zlatanova, S}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-archives-xliii-b4-2022-295-2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{295--301}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ifc2indoorgml: An open-source tool for generating IndoorGML from IFC}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{XLIII-B4-2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Infrared thermography in the built environment: A multi-scale review</title><link>https://ual.sg/post/2022/05/24/new-paper-infrared-thermography-in-the-built-environment-a-multi-scale-review/</link><pubDate>Tue, 24 May 2022 13:51:16 +0800</pubDate><guid>https://ual.sg/post/2022/05/24/new-paper-infrared-thermography-in-the-built-environment-a-multi-scale-review/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper in which we were involved:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Martin M, Chong A, Biljecki F, Miller C (2022): Infrared thermography in the built environment: A multi-scale review. &lt;em&gt;Renewable and Sustainable Energy Reviews&lt;/em&gt; 165: 112540. &lt;a href="https://doi.org/10.1016/j.rser.2022.112540" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.rser.2022.112540&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-rser-thermography-review/2022-rser-thermography-review.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This review paper was led by &lt;a href="https://scholar.google.com/citations?user=KngdHq4AAAAJ&amp;amp;hl=en" target="_blank" rel="noopener"&gt;Dr Miguel Martin&lt;/a&gt; from the &lt;a href="https://bears.berkeley.edu" target="_blank" rel="noopener"&gt;Berkeley Education Alliance for Research in Singapore&lt;/a&gt;.
Our sister labs &amp;mdash; the &lt;a href="https://www.budslab.org" target="_blank" rel="noopener"&gt;Building and Urban Data Science (BUDS) Lab&lt;/a&gt; and the &lt;a href="https://ideaslab.io" target="_blank" rel="noopener"&gt;Integrated Data, Energy Analysis + Simulation (IDEAS) Lab&lt;/a&gt; &amp;mdash; have also been involved.&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The paper presents a review of major contributions in infrared thermography to study the built environment at multiple scales. To compile the review, hundreds of studies conducted between the 1980s and 2020s were first selected based on their relevance to the scope. Afterward, the most relevant contributions were classified and chronologically sorted. From the classification, it is observed that most reviewed studies were conducted to evaluate the thermal performance of buildings or detect their defects using images collected by an infrared camera. At the same time, a considerable number of studies used thermal images obtained by a satellite to observe the urban heat island effect. Despite the large number of contributions in infrared thermography at multiple scales of the built environment, three main research gaps or opportunities can be identified in the literature. First, it would be possible to perform a more detailed analysis of urban heat fluxes using thermal images collected at multiple scales. Then, thermal images collected by a mounted or handheld infrared camera could be used to create building energy models. Finally, better visualization tools could be developed to monitor a city’s energy use and improve its sustainability if thermal images were integrated into Internet-of-Things and digital twin platforms.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-rser-thermography-review/"&gt;paper&lt;/a&gt;, published open access. &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-rser-thermography-review/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/05/24/new-paper-infrared-thermography-in-the-built-environment-a-multi-scale-review/page-one_hu_c322a6e2da50a6af.webp 400w,
/post/2022/05/24/new-paper-infrared-thermography-in-the-built-environment-a-multi-scale-review/page-one_hu_7d167a6c0fb3eddd.webp 760w,
/post/2022/05/24/new-paper-infrared-thermography-in-the-built-environment-a-multi-scale-review/page-one_hu_675647df39e70aa3.webp 1200w"
src="https://ual.sg/post/2022/05/24/new-paper-infrared-thermography-in-the-built-environment-a-multi-scale-review/page-one_hu_c322a6e2da50a6af.webp"
width="601"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_rser_thermography_review&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Martin, Miguel and Chong, Adrian and Biljecki, Filip and Miller, Clayton}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.rser.2022.112540}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Renewable and Sustainable Energy Reviews}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{112540}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Infrared thermography in the built environment: A multi-scale review}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{165}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Introducing the Global Building Morphology Indicators</title><link>https://ual.sg/post/2022/05/05/introducing-the-global-building-morphology-indicators/</link><pubDate>Thu, 05 May 2022 12:41:37 +0800</pubDate><guid>https://ual.sg/post/2022/05/05/introducing-the-global-building-morphology-indicators/</guid><description>&lt;p&gt;We are pleased to share that our latest project &lt;a href="https://ual.sg/project/gbmi/"&gt;&lt;em&gt;Global Building Morphology Indicators&lt;/em&gt;&lt;/a&gt; has been
published as a namesake article in Computers, Environment and Urban Systems:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Chow YS (2022): Global Building Morphology Indicators. &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt; 95: 101809.
&lt;a href="https://doi.org/10.1016/j.compenvurbsys.2022.101809" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt;10.1016/j.compenvurbsys.2022.101809&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-ceus-gbmi/2022-ceus-gbmi.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/i&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Urban morphology has been instrumental in a variety of disciplines.
Addressing multiple research gaps related to the building aspect, GBMI is a cohesive set of three contributions in this domain:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;A comprehensive catalogue of commonly used metrics characterising the urban building form based on a systematic literature review. We consolidated a set of hundreds of indicators.&lt;/li&gt;
&lt;li&gt;An open-source database solution to implement all these metrics based on building footprints. There are some software approaches to deal with quantitative analyses of the built form, but none relying on a database, which may be better suited to big data analyses.&lt;/li&gt;
&lt;li&gt;An open dataset with the computed metrics for dozens of urban areas around the world. Such a contribution, bringing ready-to-use datasets, may facilitate analyses, especially comparative ones across multiple study areas.&lt;/li&gt;
&lt;/ul&gt;
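&lt;p&gt;To illustrate the idea behind aggregated indicators of this kind, the toy sketch below computes zone-level summary statistics over per-building values such as heights. This is plain Python with made-up zone names and values, not the database implementation the paper describes.&lt;/p&gt;

```python
# Toy illustration (not the GBMI database implementation): aggregating
# per-building values, e.g. heights in metres, into zone-level summary
# indicators. Zone names and heights below are invented examples.
import statistics

def zone_indicators(building_heights):
    """Summary statistics over an array of per-building values for one zone."""
    return {
        "count": len(building_heights),
        "min": min(building_heights),
        "max": max(building_heights),
        "mean": statistics.mean(building_heights),
        "stdev": statistics.pstdev(building_heights),  # population std. dev.
    }

heights_by_zone = {
    "zone_a": [12.0, 15.0, 48.0],        # mixed low-rise and one tower
    "zone_b": [9.0, 9.5, 10.0, 11.5],    # homogeneous low-rise
}
for zone, heights in heights_by_zone.items():
    print(zone, zone_indicators(heights))
```

&lt;p&gt;In the actual project, such aggregations are expressed as SQL over building footprint geometries, which scales better to city-sized inputs than per-zone Python loops.&lt;/p&gt;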
&lt;p&gt;
&lt;figure id="figure-examples-of-aggregated-indicators-that-are-based-on-summary-statistics-from-an-array-of-values-such-as-building-heights-each-of-these-indicators-has-several-counterparts-pertaining-to-the-same-array-of-values-such-as-minimum-value-and-standard-deviation"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Examples of aggregated indicators that are based on summary statistics from an array of values such as building heights. Each of these indicators has several counterparts pertaining to the same array of values, such as minimum value and standard deviation." srcset="
/post/2022/05/05/introducing-the-global-building-morphology-indicators/zone-level2_hu_3c6c23dd2759746b.webp 400w,
/post/2022/05/05/introducing-the-global-building-morphology-indicators/zone-level2_hu_7b8bacf1a3c20d18.webp 760w,
/post/2022/05/05/introducing-the-global-building-morphology-indicators/zone-level2_hu_2b936626cb773f73.webp 1200w"
src="https://ual.sg/post/2022/05/05/introducing-the-global-building-morphology-indicators/zone-level2_hu_3c6c23dd2759746b.webp"
width="711"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Examples of aggregated indicators that are based on summary statistics from an array of values such as building heights. Each of these indicators has several counterparts pertaining to the same array of values, such as minimum value and standard deviation.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
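Aggregated indicators of this kind can be illustrated with a small sketch. Assuming a simple list of building records tagged with a zone identifier (hypothetical data, not the GBMI schema), per-zone summary statistics over the array of heights could be computed as:

```python
import statistics
from collections import defaultdict

# Hypothetical building records: (zone identifier, height in metres).
# The values are illustrative and not part of the GBMI dataset.
buildings = [
    ("zone-A", 12.0), ("zone-A", 45.0), ("zone-A", 30.0),
    ("zone-B", 8.0), ("zone-B", 10.0),
]

# Group the heights by zone.
heights_by_zone = defaultdict(list)
for zone, height in buildings:
    heights_by_zone[zone].append(height)

# Aggregate each zone's array of heights into summary statistics,
# mirroring indicators such as mean, minimum, maximum, and standard deviation.
indicators = {
    zone: {
        "mean": statistics.mean(h),
        "min": min(h),
        "max": max(h),
        "stdev": statistics.stdev(h) if len(h) > 1 else 0.0,
    }
    for zone, h in heights_by_zone.items()
}
```

In the actual GBMI implementation such aggregations are performed in a database rather than in application code, which scales better to billions of records.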
&lt;p&gt;The paper also includes a variety of analyses to give a peek into the work and hint at some potential applications.&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-the-relationship-between-the-complexity-of-buildings-and-their-normalised-number-in-a-zone-these-two-aggregated-indicators-are-usually-negatively-correlated-with-some-exceptions"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="The relationship between the complexity of buildings and their normalised number in a zone. These two aggregated indicators are usually negatively correlated, with some exceptions." srcset="
/post/2022/05/05/introducing-the-global-building-morphology-indicators/complexity_corr_hu_56a0fb8ac3cecc01.webp 400w,
/post/2022/05/05/introducing-the-global-building-morphology-indicators/complexity_corr_hu_8870000f4027103b.webp 760w,
/post/2022/05/05/introducing-the-global-building-morphology-indicators/complexity_corr_hu_cd613abb6c6ff3f4.webp 1200w"
src="https://ual.sg/post/2022/05/05/introducing-the-global-building-morphology-indicators/complexity_corr_hu_56a0fb8ac3cecc01.webp"
width="760"
height="173"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
The relationship between the complexity of buildings and their normalised number in a zone. These two aggregated indicators are usually negatively correlated, with some exceptions.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The project has a dedicated &lt;a href="https://ual.sg/project/gbmi"&gt;website&lt;/a&gt;.
Please visit it to learn more about the metrics, software, and dataset.&lt;/p&gt;
&lt;p&gt;This project &lt;a href="https://ual.sg/post/2021/05/23/nus-urban-analytics-lab-scales-research-globally-with-aws/"&gt;has also been featured by Amazon on their AWS Public Sector Blog&lt;/a&gt;.
We are grateful for their support.&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Characterising and analysing urban morphology is a continuous task in urban data science, environmental analyses, and many other domains. As the availability and quality of data on them have been increasing, buildings have gained more attention. However, tools and data facilitating large-scale studies, together with an interdisciplinary consensus on metrics, remain scarce and often inadequate. We present Global Building Morphology Indicators (GBMI) — a three-pronged contribution addressing such shortcomings: (i) a comprehensive list of hundreds of building form multi-scale measures derived through a systematic literature review; (ii) a methodology and tool for the computation of these metrics in a database suited for big data and comparative studies, and release the code freely and open-source; and (iii) we carry out the computations using high performance computing, generating a public repository with data quantifying the form of selected urban areas around the world, and demonstrate their value with novel analyses comparing morphological parameters across cities. GBMI introduces a formalised, structured, modular, and extensible method to compute, manage, and disseminate urban indicators at a large scale and high resolution, while the precomputed dataset facilitates comparative studies. The theory and implementation traverse multiple scales: at the building level, both individual and contextual ones based on encircling buildings by multiple buffers, and aggregations at several hierarchical administrative levels and at multiple grids. Our open dataset, comprising billions of records on a growing scope of urban areas worldwide, is the most comprehensive instance of morphological data parametrising the individual building stock, supporting studies in urban analytics and a range of disciplines.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-ceus-gbmi/"&gt;paper&lt;/a&gt; (open
access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-ceus-gbmi/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/05/05/introducing-the-global-building-morphology-indicators/page-one_hu_fe01f6122600f29e.webp 400w,
/post/2022/05/05/introducing-the-global-building-morphology-indicators/page-one_hu_aee8517ca3b1b76.webp 760w,
/post/2022/05/05/introducing-the-global-building-morphology-indicators/page-one_hu_81a0271db25568ec.webp 1200w"
src="https://ual.sg/post/2022/05/05/introducing-the-global-building-morphology-indicators/page-one_hu_fe01f6122600f29e.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_ceus_gbmi&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Biljecki, Filip and Chow, Yoong Shin}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2022.101809}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{101809}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Global Building Morphology Indicators}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{95}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Population estimation beyond counts—Inferring demographic characteristics</title><link>https://ual.sg/post/2022/04/06/new-paper-population-estimation-beyond-countsinferring-demographic-characteristics/</link><pubDate>Wed, 06 Apr 2022 19:55:16 +0800</pubDate><guid>https://ual.sg/post/2022/04/06/new-paper-population-estimation-beyond-countsinferring-demographic-characteristics/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Szarka N, Biljecki F (2022): Population estimation beyond counts—Inferring demographic characteristics. &lt;em&gt;PLOS ONE&lt;/em&gt; 17(4): e0266484. &lt;a href="https://doi.org/10.1371/journal.pone.0266484" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1371/journal.pone.0266484&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-plos-population-estimation/2022-plos-population-estimation.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;In this research, we advance population estimation, a traditional research topic in GIS.
Most studies so far have been confined to predicting population counts.
This comparative work, involving multiple machine learning techniques, goes beyond estimating the number of people in an area: it also infers the demographic characteristics behind the counts, namely the average age of residents and the share of seniors in a region.&lt;/p&gt;
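The idea of predicting a demographic characteristic from built-environment predictors can be sketched with a toy regression. The data below are synthetic and the single predictor (share of small flats in a district) is hypothetical; the paper itself compares Random Forest, Support Vector Machines, and Linear Regression on real Singapore data:

```python
# Toy example: fit a simple linear regression predicting the average age
# of residents in a district from one hypothetical predictor (the share
# of small flats). Synthetic data, for illustration only.
districts = [
    # (share of small flats, average resident age)
    (0.10, 44.0), (0.25, 42.5), (0.40, 41.0), (0.55, 39.5), (0.70, 38.0),
]

xs = [x for x, _ in districts]
ys = [y for _, y in districts]
n = len(districts)

# Closed-form ordinary least squares for a single predictor.
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in districts) / sum(
    (x - mean_x) ** 2 for x in xs
)
intercept = mean_y - slope * mean_x

# Mean absolute error of the fit; the paper reports a mean error of about
# 1.5 years on real administrative areas.
predictions = [intercept + slope * x for x in xs]
mae = sum(abs(p - y) for p, y in zip(predictions, ys)) / n
```

A real pipeline would of course hold out test areas and use richer feature sets (POIs, property transactions, year of construction, flat types), as done in the paper.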
&lt;p&gt;Congratulations to &lt;a href="https://ual.sg/author/noee-szarka/"&gt;Noée Szarka&lt;/a&gt;, our visiting scholar from the University of Edinburgh, on the great job and the publication of her first first-author paper! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;Also, congratulations to her on graduating from the MSc in GIS programme.
Noée has continued her career as a Geospatial Developer at the municipal government of the City of Lucerne in Switzerland.&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Mapping population distribution at a fine spatial scale is essential for urban studies and planning. Numerous studies, mainly supported by geospatial and statistical methods, have focused primarily on predicting population counts. However, estimating their socio-economic characteristics beyond population counts, such as average age, income, and gender ratio, remains unattended. We enhance traditional population estimation by predicting not only the number of residents in an area, but also their demographic characteristics: average age and the proportion of seniors. By implementing and comparing different machine learning techniques (Random Forest, Support Vector Machines, and Linear Regression) in administrative areas in Singapore, we investigate the use of point of interest (POI) and real estate data for this purpose. The developed regression model predicts the average age of residents in a neighbourhood with a mean error of about 1.5 years (the range of average resident age across Singaporean districts spans approx. 14 years). The results reveal that age patterns of residents can be predicted using real estate information rather than with amenities, which is in contrast to estimating population counts. Another contribution of our work in population estimation is the use of previously unexploited POI and real estate datasets for it, such as property transactions, year of construction, and flat types (number of rooms). Advancing the domain of population estimation, this study reveals the prospects of a small set of detailed and strong predictors that might have the potential of estimating other demographic characteristics such as income.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-plos-population-estimation/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-plos-population-estimation/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/04/06/new-paper-population-estimation-beyond-countsinferring-demographic-characteristics/page-one_hu_c86c2ce4025c40f0.webp 400w,
/post/2022/04/06/new-paper-population-estimation-beyond-countsinferring-demographic-characteristics/page-one_hu_efaa470049651b38.webp 760w,
/post/2022/04/06/new-paper-population-estimation-beyond-countsinferring-demographic-characteristics/page-one_hu_5b2ed5f79bf413ab.webp 1200w"
src="https://ual.sg/post/2022/04/06/new-paper-population-estimation-beyond-countsinferring-demographic-characteristics/page-one_hu_c86c2ce4025c40f0.webp"
width="585"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_plos_population_estimation&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Szarka, Noée and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1371/journal.pone.0266484}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{PLOS ONE}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{4}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{e0266484}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Population estimation beyond counts---Inferring demographic characteristics}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{17}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Announcing the Urban Analytics Lab seminar series</title><link>https://ual.sg/post/2022/03/30/announcing-the-urban-analytics-lab-seminar-series/</link><pubDate>Wed, 30 Mar 2022 20:55:16 +0800</pubDate><guid>https://ual.sg/post/2022/03/30/announcing-the-urban-analytics-lab-seminar-series/</guid><description>&lt;p&gt;We started an online &lt;a href="https://ual.sg/seminars/"&gt;seminar series&lt;/a&gt;.
Our first guests were &lt;a href="http://3d.bk.tudelft.nl/alabetski" target="_blank" rel="noopener"&gt;Anna Labetski&lt;/a&gt; and &lt;a href="http://3d.bk.tudelft.nl/svitalis" target="_blank" rel="noopener"&gt;Stelios Vitalis&lt;/a&gt; from the &lt;a href="https://3d.bk.tudelft.nl" target="_blank" rel="noopener"&gt;3D Geoinformation group&lt;/a&gt; at the Delft University of Technology.
The session was moderated by Dr &lt;a href="https://ual.sg/author/marcel-ignatius/"&gt;Marcel Ignatius&lt;/a&gt;, who is the initiator and manager of the series.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/03/30/announcing-the-urban-analytics-lab-seminar-series/first-seminar_hu_8ce4d81823d1e85f.webp 400w,
/post/2022/03/30/announcing-the-urban-analytics-lab-seminar-series/first-seminar_hu_dd0fd37871d39b7e.webp 760w,
/post/2022/03/30/announcing-the-urban-analytics-lab-seminar-series/first-seminar_hu_6dcfc230375f9145.webp 1200w"
src="https://ual.sg/post/2022/03/30/announcing-the-urban-analytics-lab-seminar-series/first-seminar_hu_8ce4d81823d1e85f.webp"
width="760"
height="147"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The Urban Analytics Lab seminar series is part of our community engagement, providing a platform for interesting discussions, new ideas, and new collaborations.
In the series, we give priority to early-career researchers.&lt;/p&gt;
&lt;p&gt;The next seminar will be given by &lt;a href="https://irl.ethz.ch/people/person-detail.MTY4ODA4.TGlzdC8xNzM4LC0xMzk1OTgzMDM3.html" target="_blank" rel="noopener"&gt;Sergio Wicki&lt;/a&gt; from the &lt;a href="https://plus.ethz.ch" target="_blank" rel="noopener"&gt;Planning of Landscape and Urban Systems&lt;/a&gt; group at ETH Zurich (see the poster below).
You are welcome to register &lt;a href="https://forms.gle/HttoRensJuLbAGDz7" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;For further developments and announcements, please bookmark &lt;a href="https://ual.sg/seminars/"&gt;the page of the seminar series&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/03/30/announcing-the-urban-analytics-lab-seminar-series/featured_hu_931c18de57486fb3.webp 400w,
/post/2022/03/30/announcing-the-urban-analytics-lab-seminar-series/featured_hu_5339cf0b9d52b8b1.webp 760w,
/post/2022/03/30/announcing-the-urban-analytics-lab-seminar-series/featured_hu_5ae7fa0554b2c179.webp 1200w"
src="https://ual.sg/post/2022/03/30/announcing-the-urban-analytics-lab-seminar-series/featured_hu_931c18de57486fb3.webp"
width="537"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Welcome to our three new research fellows</title><link>https://ual.sg/post/2022/03/29/welcome-to-our-three-new-research-fellows/</link><pubDate>Tue, 29 Mar 2022 12:55:16 +0800</pubDate><guid>https://ual.sg/post/2022/03/29/welcome-to-our-three-new-research-fellows/</guid><description>&lt;p&gt;Our research group is delighted to welcome three postdoctoral research fellows to the team: Dr &lt;a href="https://ual.sg/author/marcel-ignatius/"&gt;Marcel Ignatius&lt;/a&gt;, Dr &lt;a href="https://ual.sg/author/pengyuan-liu/"&gt;Pengyuan Liu&lt;/a&gt;, and Dr &lt;a href="https://ual.sg/author/mario-frei/"&gt;Mario Frei&lt;/a&gt;.
They will be working on a new project on digital twins at our Lab.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/03/29/welcome-to-our-three-new-research-fellows/featured_hu_527d478d99841ad9.webp 400w,
/post/2022/03/29/welcome-to-our-three-new-research-fellows/featured_hu_57c0a057a846db31.webp 760w,
/post/2022/03/29/welcome-to-our-three-new-research-fellows/featured_hu_d622cfd3df99668b.webp 1200w"
src="https://ual.sg/post/2022/03/29/welcome-to-our-three-new-research-fellows/featured_hu_527d478d99841ad9.webp"
width="760"
height="447"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper: GANmapper - Geographical Data Translation</title><link>https://ual.sg/post/2022/03/09/new-paper-ganmapper-geographical-data-translation/</link><pubDate>Wed, 09 Mar 2022 18:12:16 +0800</pubDate><guid>https://ual.sg/post/2022/03/09/new-paper-ganmapper-geographical-data-translation/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Wu AN, Biljecki F (2022): GANmapper: geographical data translation. &lt;em&gt;International Journal of Geographical Information Science&lt;/em&gt; 36(7): 1394-1422. &lt;a href="https://doi.org/10.1080/13658816.2022.2041643" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1080/13658816.2022.2041643&lt;/a&gt; &lt;a href="https://ual.sg/publication/2022-ijgis-ganmapper/2022-ijgis-ganmapper.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Congratulations to &lt;a href="https://ual.sg/author/abraham-noah-wu/"&gt;Abraham Noah Wu&lt;/a&gt; on this very nice job and his new first-author paper! &amp;#x1f64c;&lt;/p&gt;
&lt;p&gt;In this paper, we take GANs to new heights in GIScience by introducing a new application: geographical data translation.
Leveraging the relationship between datasets, our contribution uses more commonly available and coarse geospatial data (land use and road networks) to predict less common features at a finer scale (building footprints), without direct data and measurements on them (e.g. satellite imagery or land surveying).&lt;/p&gt;
&lt;p&gt;We developed a software tool, &lt;a href="https://github.com/ualsg/GANmapper" target="_blank" rel="noopener"&gt;GANmapper&lt;/a&gt;, to translate road network data into building footprint data, demonstrating the feasibility and effectiveness of geographical data translation.
It is released open-source.&lt;/p&gt;
&lt;p&gt;The experiments suggest a high degree of veracity across different urban morphologies around the world (see the &lt;a href="#cityall"&gt;figure below&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;We hope that this new application of GANs in GIS will introduce a new approach to digital cartography and catalyse further investigations into how GANs can be leveraged in GIScience.&lt;/p&gt;
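One way to quantify how truthfully generated footprints match reference data is raster overlap. The snippet below is a generic sketch of this idea (a hypothetical helper, not the paper's evaluation code), computing intersection-over-union between two binary footprint grids:

```python
def footprint_iou(generated, reference):
    """Intersection-over-union between two binary rasters of equal size,
    where 1 marks a building pixel and 0 marks background."""
    intersection = sum(
        g & r for g_row, r_row in zip(generated, reference)
        for g, r in zip(g_row, r_row)
    )
    union = sum(
        g | r for g_row, r_row in zip(generated, reference)
        for g, r in zip(g_row, r_row)
    )
    return intersection / union if union else 1.0

# Tiny illustrative rasters: a generated footprint offset by one pixel
# relative to the reference footprint.
generated = [
    [1, 1, 0],
    [1, 1, 0],
    [0, 0, 0],
]
reference = [
    [0, 1, 1],
    [0, 1, 1],
    [0, 0, 0],
]
score = footprint_iou(generated, reference)  # intersection 2 / union 6
```

A score of 1.0 would indicate perfect overlap between generated and reference footprints; partial offsets, as above, reduce the score.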
&lt;figure id="figure-cityall"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Synthetic results of GANmapper, together with the reference data, across several diverse cities around the world" srcset="
/post/2022/03/09/new-paper-ganmapper-geographical-data-translation/cityall_hu_9a205348d0707961.webp 400w,
/post/2022/03/09/new-paper-ganmapper-geographical-data-translation/cityall_hu_ca52bfa65817e9dd.webp 760w,
/post/2022/03/09/new-paper-ganmapper-geographical-data-translation/cityall_hu_e6e7391d0e83f7f8.webp 1200w"
src="https://ual.sg/post/2022/03/09/new-paper-ganmapper-geographical-data-translation/cityall_hu_9a205348d0707961.webp"
width="681"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Synthetic results of GANmapper, together with the reference data, across several diverse cities around the world
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;We present a new method to create spatial data using a generative adversarial network (GAN).
Our contribution uses coarse and widely available geospatial data to create maps of less available features at the finer scale in the built environment, bypassing their traditional acquisition techniques (e.g. satellite imagery or land surveying).
In the work, we employ land use data and road networks as input to generate building footprints and conduct experiments in 9 cities around the world.
The method, which we implement in a tool we release openly, enables the translation of one geospatial dataset to another with high fidelity and morphological accuracy.
It may be especially useful in locations missing detailed and high-resolution data and those that are mapped with uncertain or heterogeneous quality, such as much of OpenStreetMap.
The quality of the results is influenced by the urban form and scale.
In most cases, the experiments suggest promising performance as the method tends to truthfully indicate the locations, amount, and shape of buildings.
The work has the potential to support several applications, such as energy, climate, and urban morphology studies in areas previously lacking required data or inpainting geospatial data in regions with incomplete data.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2022-ijgis-ganmapper/"&gt;paper&lt;/a&gt; or access it directly at the &lt;a href="https://doi.org/10.1080/13658816.2022.2041643" target="_blank" rel="noopener"&gt;publisher&amp;rsquo;s website&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="code"&gt;Code&lt;/h3&gt;
&lt;p&gt;The code is shared openly on our &lt;a href="https://github.com/ualsg/GANmapper" target="_blank" rel="noopener"&gt;Github repository&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2022-ijgis-ganmapper/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2022/03/09/new-paper-ganmapper-geographical-data-translation/page-one_hu_c137c257235f7cea.webp 400w,
/post/2022/03/09/new-paper-ganmapper-geographical-data-translation/page-one_hu_a84cc2d8ad13ff08.webp 760w,
/post/2022/03/09/new-paper-ganmapper-geographical-data-translation/page-one_hu_c63c3b422b9b6881.webp 1200w"
src="https://ual.sg/post/2022/03/09/new-paper-ganmapper-geographical-data-translation/page-one_hu_c137c257235f7cea.webp"
width="561"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2022_ijgis_ganmapper&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2022}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Wu, Abraham Noah and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{GANmapper: geographical data translation}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{International Journal of Geographical Information Science}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{36}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;issue&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{7}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1394-1422}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1080/13658816.2022.2041643}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>A round-up of 2021</title><link>https://ual.sg/post/2021/12/24/a-round-up-of-2021/</link><pubDate>Fri, 24 Dec 2021 15:35:16 +0800</pubDate><guid>https://ual.sg/post/2021/12/24/a-round-up-of-2021/</guid><description>&lt;p&gt;As we are approaching the end of the year, it is time to reflect and be grateful for all the progress we made in 2021.
As a new research group, we are happy to have consolidated our research directions and set strong foundations for the years to come.&lt;/p&gt;
&lt;p&gt;The list of our papers published so far is &lt;a href="https://ual.sg/publication/"&gt;here&lt;/a&gt;, while our news items are available at our &lt;a href="https://ual.sg/post/"&gt;blog&lt;/a&gt;.
We also released a couple of &lt;a href="https://ual.sg/data-code/"&gt;software packages and datasets&lt;/a&gt; that may be of interest to the wider community.
Our team has also been enriched &lt;a href="https://ual.sg/post/2021/08/31/welcome-to-our-new-researchers/"&gt;by new researchers who have joined us this year&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Our work has also resulted in &lt;a href="https://ual.sg/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/"&gt;invited talks&lt;/a&gt;, &lt;a href="https://ual.sg/post/2021/12/22/teaching-excellence-award/"&gt;awards&lt;/a&gt;, press releases, &lt;a href="https://ual.sg/post/2021/11/03/the-pi-of-the-lab-was-appointed-to-the-editorial-board-of-landscape-and-urban-planning/"&gt;an editorial board appointment&lt;/a&gt;, and &lt;a href="https://ual.sg/post/2021/02/17/publication-of-the-collection-emerging-topics-in-3d-gis/"&gt;a special issue&lt;/a&gt;.
Above all, we had lots of fun doing research together.&lt;/p&gt;
&lt;p&gt;A study led by Stanford University has identified the top 2% of scientists worldwide in a variety of fields.
The &lt;a href="https://ual.sg/authors/filip/"&gt;PI&lt;/a&gt; of the Lab is on the list.
For more information about the study, see the dataset &lt;a href="https://elsevier.digitalcommonsdata.com/datasets/btchxktzyw/3" target="_blank" rel="noopener"&gt;&amp;ldquo;Updated science-wide author databases of standardized citation indicators&amp;rdquo;&lt;/a&gt; and their papers published in PLOS Biology indicating the motivation and methodology (&lt;a href="https://doi.org/10.1371/journal.pbio.3000384" target="_blank" rel="noopener"&gt;here&lt;/a&gt; and &lt;a href="https://doi.org/10.1371/journal.pbio.3000918" target="_blank" rel="noopener"&gt;here&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;We are especially happy to celebrate the achievements of our master students, mostly based on their &lt;a href="https://ual.sg/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/"&gt;graduation projects&lt;/a&gt;.
Four papers first-authored or co-authored by master students have been published in leading journals this year, such as &lt;a href="https://www.journals.elsevier.com/computers-environment-and-urban-systems" target="_blank" rel="noopener"&gt;Computers, Environment and Urban Systems&lt;/a&gt;:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Palliwal A, Song S, Tan HTW, Biljecki F (2021): 3D city models for urban farming site identification in buildings. &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt; 86: 101584. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2020.101584" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2020.101584&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-ceus-3-d-farming/2021-ceus-3-d-farming.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Chen W, Wu AN, Biljecki F (2021): Classification of Urban Morphology with Deep Learning: Application on Urban Vitality. &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt; 90: 101706. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2021.101706" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2021.101706&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-ceus-dl-morphology/2021-ceus-dl-morphology.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Ito K, Biljecki F (2021): Assessing bikeability with street view imagery and computer vision. &lt;em&gt;Transportation Research Part C: Emerging Technologies&lt;/em&gt; 132: 103371. &lt;a href="https://doi.org/10.1016/j.trc.2021.103371" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.trc.2021.103371&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-trc-bikeability/2021-trc-bikeability.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Ito K (2021): Street view imagery in urban analytics and GIS: A review. &lt;em&gt;Landscape and Urban Planning&lt;/em&gt; 215: 104217. &lt;a href="https://doi.org/10.1016/j.landurbplan.2021.104217" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.landurbplan.2021.104217&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-land-svi-review/2021-land-svi-review.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The National University of Singapore continued performing very well in rankings, with QS &lt;a href="https://news.nus.edu.sg/nus-ranked-top-10-globally-in-16-subjects/" target="_blank" rel="noopener"&gt;considering&lt;/a&gt; it among the world&amp;rsquo;s best universities and &lt;a href="https://ual.sg/post/2021/03/04/our-department-is-among-the-top-10-globally-and-best-in-asia/"&gt;our department among the top 10 in the world&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;2021 is also the last year of our NUS School of Design and Environment, which has played an important role in developing Singapore&amp;rsquo;s built environment research and education over the past five decades.
As of 1 January 2022, it will be merged with the NUS Faculty of Engineering, starting an exciting new chapter: the &lt;a href="https://cde.nus.edu.sg" target="_blank" rel="noopener"&gt;NUS College of Design and Engineering&lt;/a&gt;.
Read more about this fusion in the &lt;a href="https://news.nus.edu.sg/two-new-colleges-at-nus-to-deliver-flexible-interdisciplinary-education-more-accessibly-and-at-greater-scale/" target="_blank" rel="noopener"&gt;press release by NUS&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We wish everyone a new year filled with success, happiness, and prosperity!&lt;/p&gt;
&lt;p&gt;You can also follow our work on &lt;a href="http://twitter.com/urbanalyticslab" target="_blank" rel="noopener"&gt;Twitter&lt;/a&gt; and &lt;a href="https://www.linkedin.com/company/urban-analytics-lab/" target="_blank" rel="noopener"&gt;LinkedIn&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Teaching Excellence Award</title><link>https://ual.sg/post/2021/12/22/teaching-excellence-award/</link><pubDate>Wed, 22 Dec 2021 07:15:16 +0800</pubDate><guid>https://ual.sg/post/2021/12/22/teaching-excellence-award/</guid><description>&lt;p&gt;Asst Prof &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; was awarded the Teaching Excellence Award by the NUS School of Design and Environment.
The award is presented in recognition of excellence, dedication, and commitment to the education of students at NUS in AY2020/2021, based on the three graduate modules in urban data science in which our Lab is involved and other activities such as &lt;a href="https://ual.sg/opportunities/student-projects/"&gt;supervision of master theses&lt;/a&gt;.
It was conferred on him by Prof Lam Khee Poh, Dean of SDE.&lt;/p&gt;
&lt;p&gt;&amp;#x1f389;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/12/22/teaching-excellence-award/1_hu_f61ff5853579f2a8.webp 400w,
/post/2021/12/22/teaching-excellence-award/1_hu_c05e08cb928b55a2.webp 760w,
/post/2021/12/22/teaching-excellence-award/1_hu_202583231db470e.webp 1200w"
src="https://ual.sg/post/2021/12/22/teaching-excellence-award/1_hu_f61ff5853579f2a8.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New paper: Operational characteristics of residential air conditioners with temporally granular remote thermographic imaging</title><link>https://ual.sg/post/2021/11/30/new-paper-operational-characteristics-of-residential-air-conditioners-with-temporally-granular-remote-thermographic-imaging/</link><pubDate>Tue, 30 Nov 2021 13:55:16 +0800</pubDate><guid>https://ual.sg/post/2021/11/30/new-paper-operational-characteristics-of-residential-air-conditioners-with-temporally-granular-remote-thermographic-imaging/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper in which we were involved:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Arjunan P, Dobler G, Lee K, Miller C, Biljecki F, Poolla K (2021): Operational characteristics of residential air conditioners with temporally granular remote thermographic imaging. &lt;em&gt;BuildSys &amp;rsquo;21: Proceedings of the 8th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation&lt;/em&gt;, pp. 184&amp;ndash;187. &lt;a href="https://doi.org/10.1145/3486611.3486659" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1145/3486611.3486659&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-buildsys-acir/2021-buildsys-acir.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This paper, led by &lt;a href="https://www.samy101.com" target="_blank" rel="noopener"&gt;Dr Pandarasamy Arjunan&lt;/a&gt; (Berkeley Education Alliance for Research in Singapore), examines a new application of infrared thermography &amp;ndash; analysing the operational patterns of residential air conditioners.
The work has been presented at the &lt;a href="https://buildsys.acm.org/2021/" target="_blank" rel="noopener"&gt;BuildSys 2021 conference&lt;/a&gt; in Coimbra, Portugal.
Other co-authors are &lt;a href="https://www.bidenschool.udel.edu/people/gdobler" target="_blank" rel="noopener"&gt;Dr Gregory Dobler&lt;/a&gt; and &lt;a href="https://www.linkedin.com/in/kyungmin-lee-477b5b144/en" target="_blank" rel="noopener"&gt;Kyungmin Lee&lt;/a&gt; (University of Delaware), &lt;a href="https://www.budslab.org" target="_blank" rel="noopener"&gt;Clayton Miller&lt;/a&gt; (NUS Building and Urban Data Science Lab), and Prof &lt;a href="https://en.wikipedia.org/wiki/Kameshwar_Poolla" target="_blank" rel="noopener"&gt;Kameshwar Poolla&lt;/a&gt; (University of California, Berkeley).&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Collecting accurate information about the operational characteristics of buildings is essential for many performance analyses including energy auditing and benchmarking. While most of the existing solutions involve manual inspection of building premises, surveys, or in-situ sensor deployment, these techniques are intrusive and/or time-intensive, limiting their applicability at larger, city-level spatial scales. Complementing these solutions, we present a wide-area infrared thermography (IRT) based system for measuring a building&amp;rsquo;s operational characteristics remotely in a non-intrusive and scalable way, and we describe the image capture, processing pipeline, and preliminary results from an initial deployment of this system focusing on the operational characteristics of air-conditioning units. The data collection consisted of infrared imaging of a residential building for 50 days at continuous 10-second intervals. The infrared brightness variations of exterior split air-conditioners fixtures were extracted for each image, and operational attributes were then extracted from the resultant time series. Using state-based change point detection methods to determine times at which air-conditioners are operational, our preliminary analysis focuses on phenomenological patterns of activity with two main findings. First, we demonstrate our ability to determine the operational characteristics of air-conditioners and in particular the fact that they exhibit two distinct modes: continuous operation and cycling behavior. Second, we find that the fraction of &amp;ldquo;on&amp;rdquo; time spent in the cycling mode is characteristically longer for lower external temperatures, consistent with the hypothesis that the cycling mode represents a reaching of set temperature. 
Finally, we outline future research directions and challenges in leveraging IRT for behavioral studies of cooling system use and proactive assessment of air-conditioners towards the development of scalable virtual energy auditing.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2021-buildsys-acir/"&gt;paper&lt;/a&gt;, published as open access.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2021-buildsys-acir/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/11/30/new-paper-operational-characteristics-of-residential-air-conditioners-with-temporally-granular-remote-thermographic-imaging/page-one_hu_96758088880b63dd.webp 400w,
/post/2021/11/30/new-paper-operational-characteristics-of-residential-air-conditioners-with-temporally-granular-remote-thermographic-imaging/page-one_hu_4c475d4d03441f34.webp 760w,
/post/2021/11/30/new-paper-operational-characteristics-of-residential-air-conditioners-with-temporally-granular-remote-thermographic-imaging/page-one_hu_a42bbb6c533679b0.webp 1200w"
src="https://ual.sg/post/2021/11/30/new-paper-operational-characteristics-of-residential-air-conditioners-with-temporally-granular-remote-thermographic-imaging/page-one_hu_96758088880b63dd.webp"
width="587"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@inproceedings&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2021_buildsys_acir&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2021}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Arjunan, Pandarasamy and Dobler, Gregory and Lee, Kyungmin and Miller, Clayton and Biljecki, Filip and Poolla, Kameshwar}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Operational characteristics of residential air conditioners with temporally granular remote thermographic imaging}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;booktitle&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{BuildSys &amp;#39;21: Proceedings of the 8th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;isbn&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{9781450391146}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1145/3486611.3486659}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{184--187}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;series&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Proceedings of the 8th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: The Internet-of-Buildings (IoB) --- Digital twin convergence of wearable and IoT data with GIS/BIM</title><link>https://ual.sg/post/2021/11/19/new-paper-the-internet-of-buildings-iob---digital-twin-convergence-of-wearable-and-iot-data-with-gis/bim/</link><pubDate>Fri, 19 Nov 2021 18:55:16 +0800</pubDate><guid>https://ual.sg/post/2021/11/19/new-paper-the-internet-of-buildings-iob---digital-twin-convergence-of-wearable-and-iot-data-with-gis/bim/</guid><description>&lt;p&gt;We are glad to share a new collaborative paper in which we were involved:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Miller C, Abdelrahman M, Chong A, Biljecki F, Quintana M, Frei M, Chew M, Wong D (2021): The Internet-of-Buildings (IoB) &amp;mdash; Digital twin convergence of wearable and IoT data with GIS/BIM. &lt;em&gt;Journal of Physics: Conference Series&lt;/em&gt; 2042(1): 012041. &lt;a href="https://doi.org/10.1088/1742-6596/2042/1/012041" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1088/1742-6596/2042/1/012041&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-cisbat-iob/2021-cisbat-iob.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This paper, led by Dr Clayton Miller from the NUS Department of the Built Environment, is a result of a collaboration with our sister labs &amp;mdash; the &lt;a href="https://www.budslab.org" target="_blank" rel="noopener"&gt;Building and Urban Data Science (BUDS) Lab&lt;/a&gt; and the &lt;a href="https://ideaslab.io" target="_blank" rel="noopener"&gt;Integrated Data, Energy Analysis + Simulation (IDEAS) Lab&lt;/a&gt;.
The work has been presented at the &lt;a href="https://cisbat.epfl.ch" target="_blank" rel="noopener"&gt;CISBAT 2021 conference&lt;/a&gt; in Lausanne, Switzerland.&lt;/p&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Internet-of-Things (IoT) devices in buildings and wearable technologies for occupants are quickly becoming widespread. These technologies provide copious amounts of high-quality temporal data pertaining to indoor and outdoor environmental quality, comfort, and energy consumption. However, a barrier to their use in many applications is the lack of spatial context in the built environment. Adding Building Information Models (BIM) and Geographic Information Systems (GIS) to these temporal sources unleashes potential. We call this data convergence the Internet-of-Buildings or IoB. In this paper, a digital twin case study of data intersection from various systems is outlined. Initial insights are discussed for an experiment with 17 participants that focused on the collection of occupant subjective feedback to characterize indoor comfort. The results illustrate the ability to capture data from wearables in the context of a BIM data environment.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2021-cisbat-iob/"&gt;paper&lt;/a&gt;, published as open access.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2021-cisbat-iob/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/11/19/new-paper-the-internet-of-buildings-iob---digital-twin-convergence-of-wearable-and-iot-data-with-gis/bim/page-one_hu_ec3f29d486fc46f5.webp 400w,
/post/2021/11/19/new-paper-the-internet-of-buildings-iob---digital-twin-convergence-of-wearable-and-iot-data-with-gis/bim/page-one_hu_6dd8174ecc013098.webp 760w,
/post/2021/11/19/new-paper-the-internet-of-buildings-iob---digital-twin-convergence-of-wearable-and-iot-data-with-gis/bim/page-one_hu_1aae3be5a755de0.webp 1200w"
src="https://ual.sg/post/2021/11/19/new-paper-the-internet-of-buildings-iob---digital-twin-convergence-of-wearable-and-iot-data-with-gis/bim/page-one_hu_ec3f29d486fc46f5.webp"
width="559"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2021_cisbat_iob&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Miller, Clayton and Abdelrahman, Mahmoud and Chong, Adrian and Biljecki, Filip and Quintana, Matias and Frei, Mario and Chew, Michael and Wong, Daniel}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1088/1742-6596/2042/1/012041}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Journal of Physics: Conference Series}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{012041}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{The Internet-of-Buildings (IoB) --- Digital twin convergence of wearable and IoT data with GIS/BIM}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2042}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2021}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>The PI of the Lab was appointed to the Editorial Board of Landscape and Urban Planning</title><link>https://ual.sg/post/2021/11/03/the-pi-of-the-lab-was-appointed-to-the-editorial-board-of-landscape-and-urban-planning/</link><pubDate>Wed, 03 Nov 2021 09:55:16 +0800</pubDate><guid>https://ual.sg/post/2021/11/03/the-pi-of-the-lab-was-appointed-to-the-editorial-board-of-landscape-and-urban-planning/</guid><description>&lt;p&gt;&lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; was appointed as member of the &lt;a href="https://www.sciencedirect.com/journal/landscape-and-urban-planning/about/editorial-board" target="_blank" rel="noopener"&gt;Editorial Board&lt;/a&gt; of &lt;a href="https://www.sciencedirect.com/journal/landscape-and-urban-planning" target="_blank" rel="noopener"&gt;Landscape and Urban Planning&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Landscape and Urban Planning&lt;/em&gt; is an international journal aimed at advancing conceptual, scientific, and applied understandings of landscape in order to promote sustainable solutions for landscape change.
The journal, published by Elsevier, was established in 1974 as &lt;em&gt;Landscape Planning&lt;/em&gt;, and it has since risen to rank among the top 1% of journals in the category of Urban Studies according to Scopus.&lt;/p&gt;
&lt;p&gt;In the past few years, the journal has published scores of highly relevant articles in domains coinciding with the interests of our &lt;a href="https://ual.sg/"&gt;Lab&lt;/a&gt;, such as 3D city modelling, urban form, and street view imagery.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/11/03/the-pi-of-the-lab-was-appointed-to-the-editorial-board-of-landscape-and-urban-planning/featured_hu_d7692b2317a6936.webp 400w,
/post/2021/11/03/the-pi-of-the-lab-was-appointed-to-the-editorial-board-of-landscape-and-urban-planning/featured_hu_43f5ee67314d9fe2.webp 760w,
/post/2021/11/03/the-pi-of-the-lab-was-appointed-to-the-editorial-board-of-landscape-and-urban-planning/featured_hu_a913d4abe298af95.webp 1200w"
src="https://ual.sg/post/2021/11/03/the-pi-of-the-lab-was-appointed-to-the-editorial-board-of-landscape-and-urban-planning/featured_hu_d7692b2317a6936.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Exploration of research data repositories</title><link>https://ual.sg/post/2021/10/25/exploration-of-research-data-repositories/</link><pubDate>Mon, 25 Oct 2021 09:34:24 +0800</pubDate><guid>https://ual.sg/post/2021/10/25/exploration-of-research-data-repositories/</guid><description>&lt;p&gt;Researchers these days inevitably depend on many kinds and sources of data. They usually start from one or more sources of input data, and some generate datasets that are contributed as open data, made available to the public upon the publication of papers. In some cases, these open data may exist in multiple versions as data are updated or new data are added. Customisable access control may be a requirement if the research data are part of the peer-review process together with the research paper. It is therefore critical, when choosing a data repository, to evaluate it against our use case, data management workflow, and requirements.&lt;/p&gt;
&lt;p&gt;While searching for a data repository solution for our projects, we found that information about research data repositories and open data repositories is scattered across the internet. We therefore decided to compile a list of repositories relevant to our field and compare them. We hope that other research groups will find this exploration useful when deciding how to share open data arising from their projects.&lt;/p&gt;
&lt;p&gt;In general, there are four categories of research data repository options:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;data repositories associated with academic institutes,&lt;/li&gt;
&lt;li&gt;generic open data repositories,&lt;/li&gt;
&lt;li&gt;open data registries backed by private companies, and&lt;/li&gt;
&lt;li&gt;self-hosting.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;There is an endless list of options in each of these categories, so we have selected a few major ones from each to compare and contrast. The comparison is based on the following aspects that are important to us:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;supported file formats&lt;/li&gt;
&lt;li&gt;file size limitation&lt;/li&gt;
&lt;li&gt;total storage space limitation&lt;/li&gt;
&lt;li&gt;version control&lt;/li&gt;
&lt;li&gt;access control&lt;/li&gt;
&lt;li&gt;data license requirements&lt;/li&gt;
&lt;li&gt;cost&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="academic-data-repositories"&gt;Academic Data Repositories&lt;/h3&gt;
&lt;p&gt;These repositories are usually set up within an academic institute to host the research publications and/or research data of its research labs. While such systems are set up within a single institute, some, like Dataverse at Harvard University and ICPSR at the University of Michigan, are open to hosting publications and data contributed by researchers from other academic institutes.&lt;/p&gt;
&lt;p&gt;Here we compare ScholarBank@NUS, Dataverse@Harvard and ICPSR@UMich.&lt;/p&gt;
&lt;figure id="figure-comparison-of-academic-repositories-click-hereacademic-reposjpg-for-the-image-in-full-size"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Comparison of academic repositories. Click [here](academic-repos.jpg) for the image in full-size." srcset="
/post/2021/10/25/exploration-of-research-data-repositories/academic-repos_hu_e94e8c1c23fb41e0.webp 400w,
/post/2021/10/25/exploration-of-research-data-repositories/academic-repos_hu_9ddf21c6d9a9a269.webp 760w,
/post/2021/10/25/exploration-of-research-data-repositories/academic-repos_hu_26abf78bce654efe.webp 1200w"
src="https://ual.sg/post/2021/10/25/exploration-of-research-data-repositories/academic-repos_hu_e94e8c1c23fb41e0.webp"
width="749"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Comparison of academic repositories. Click &lt;a href="academic-repos.jpg"&gt;here&lt;/a&gt; for the image in full-size.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="general-research-data-repositories"&gt;General Research Data Repositories&lt;/h3&gt;
&lt;p&gt;Other than the academic research repositories, there are other types of general research data repositories. These are generally partnered with publishers, associated with government sectors, or backed by non-profit organisations.&lt;/p&gt;
&lt;p&gt;In comparison with the repositories backed by academic institutes, most of these repositories offer free data deposition and maintenance services, while some take donations or offer memberships/subscriptions for higher storage allowances. Below is a comparison of Figshare, Dryad, Zenodo, Open Science Framework (OSF), PANGAEA, and Mendeley Data.&lt;/p&gt;
&lt;figure id="figure-comparison-of-general-repositories-click-heregeneric-reposjpg-for-the-image-in-full-size"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Comparison of general repositories. Click [here](generic-repos.jpg) for the image in full-size." srcset="
/post/2021/10/25/exploration-of-research-data-repositories/generic-repos_hu_5a4ea69bd18968bf.webp 400w,
/post/2021/10/25/exploration-of-research-data-repositories/generic-repos_hu_fcea1ac7175e7c96.webp 760w,
/post/2021/10/25/exploration-of-research-data-repositories/generic-repos_hu_7cbcf361d3584a3d.webp 1200w"
src="https://ual.sg/post/2021/10/25/exploration-of-research-data-repositories/generic-repos_hu_5a4ea69bd18968bf.webp"
width="574"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Comparison of general repositories. Click &lt;a href="generic-repos.jpg"&gt;here&lt;/a&gt; for the image in full-size.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="open-data-repositories-backed-by-private-companies"&gt;Open Data Repositories Backed by Private Companies&lt;/h3&gt;
&lt;p&gt;Many private companies have also shown support for the Open Data movement and offer hosting and registries for various datasets. The terms and conditions of these data repositories are usually not publicly accessible and need to be negotiated with the respective backing company.&lt;/p&gt;
&lt;p&gt;Here are the open data solutions we looked into, offered by Amazon Web Services, Google Cloud, and Microsoft.&lt;/p&gt;
&lt;figure id="figure-comparison-of-repositories-provided-by-private-companies-click-hereprivate-reposjpg-for-the-image-in-full-size"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Comparison of repositories provided by private companies. Click [here](private-repos.jpg) for the image in full-size." srcset="
/post/2021/10/25/exploration-of-research-data-repositories/private-repos_hu_bc20398511962e72.webp 400w,
/post/2021/10/25/exploration-of-research-data-repositories/private-repos_hu_5f9da9366e7ce121.webp 760w,
/post/2021/10/25/exploration-of-research-data-repositories/private-repos_hu_582075b3cad1d57c.webp 1200w"
src="https://ual.sg/post/2021/10/25/exploration-of-research-data-repositories/private-repos_hu_bc20398511962e72.webp"
width="760"
height="167"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Comparison of repositories provided by private companies. Click &lt;a href="private-repos.jpg"&gt;here&lt;/a&gt; for the image in full-size.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="self-hosting"&gt;Self-Hosting&lt;/h3&gt;
&lt;p&gt;Lastly, there is always the option to self-host the research data. This option may offer more flexibility, freedom, and ease of updating the dataset as we wish, but it requires the development and maintenance of a data access portal and the technical maintenance of a web server, in addition to research data management. Depending on the complexity of the data access portal, advanced features such as download statistics, version control, and user access modules could require considerable effort to implement.&lt;/p&gt;
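&lt;p&gt;To illustrate the kind of effort involved, here is a minimal, hypothetical sketch in Python (standard library only) of one such feature &amp;mdash; download statistics wrapped around a static file server. All names and file paths here are illustrative; a real portal would additionally need authentication, versioning, and hardening:&lt;/p&gt;

```python
import http.server
import json
import threading
from collections import Counter
from pathlib import Path


class DownloadStats:
    """Thread-safe per-file download counter, persisted as a JSON file."""

    def __init__(self, path="downloads.json"):
        self._path = Path(path)
        self._lock = threading.Lock()
        self._counts = Counter()
        # Restore counts from a previous run, if any.
        if self._path.exists():
            self._counts.update(json.loads(self._path.read_text()))

    def record(self, filename):
        """Increment the counter for one file and persist all counts."""
        with self._lock:
            self._counts[filename] += 1
            self._path.write_text(json.dumps(self._counts))

    def count(self, filename):
        """Return how many times a file has been downloaded (0 if never)."""
        return self._counts[filename]


class DataPortalHandler(http.server.SimpleHTTPRequestHandler):
    """Serve datasets from the working directory, counting each request."""

    stats = DownloadStats()

    def do_GET(self):
        self.stats.record(self.path)
        super().do_GET()


# To run the portal (blocks forever, serving the current directory):
#   http.server.ThreadingHTTPServer(("", 8000), DataPortalHandler).serve_forever()
```

&lt;p&gt;Even this toy version shows why self-hosting carries overhead: persistence, concurrency, and serving all have to be handled by the research group itself, whereas the hosted repositories above provide them out of the box.&lt;/p&gt;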
&lt;h3 id="summary"&gt;Summary&lt;/h3&gt;
&lt;p&gt;Although this list does not cover all the research data repositories available, we hope it provides enough viable options for our future research and for those who are looking for open data repository solutions. We also welcome suggestions of other options that we may have overlooked.&lt;/p&gt;
&lt;p&gt;For those who are interested in more detail, the comparison can be found in a spreadsheet &lt;a href="https://docs.google.com/spreadsheets/d/1XTWrlJrfxWs1I5_fEZjnw0H_H5qDtiAmafvTh4QRBhg/edit?usp=sharing" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>New paper: Assessing bikeability with street view imagery and computer vision</title><link>https://ual.sg/post/2021/09/20/new-paper-assessing-bikeability-with-street-view-imagery-and-computer-vision/</link><pubDate>Mon, 20 Sep 2021 17:55:16 +0800</pubDate><guid>https://ual.sg/post/2021/09/20/new-paper-assessing-bikeability-with-street-view-imagery-and-computer-vision/</guid><description>&lt;p&gt;We are glad to share our new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Ito K, Biljecki F (2021): Assessing bikeability with street view imagery and computer vision. &lt;em&gt;Transportation Research Part C: Emerging Technologies&lt;/em&gt; 132: 103371. &lt;a href="https://doi.org/10.1016/j.trc.2021.103371" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.trc.2021.103371&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-trc-bikeability/2021-trc-bikeability.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;In this research, a new approach to assessing bikeability has been developed: automated, at high resolution, at a large spatial scale, and based on street view imagery.&lt;/p&gt;
&lt;p&gt;Congratulations to &lt;a href="https://ual.sg/author/koichi-ito/"&gt;Koichi Ito&lt;/a&gt;, our Master of Urban Planning graduate, on the great job and the publication of his first first-author paper! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;Transportation Research Part C: Emerging Technologies is a top 1% journal in its discipline according to Scopus, and this is Koichi&amp;rsquo;s second paper overall in the same month, with the first one being a &lt;a href="https://ual.sg/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/"&gt;review paper published on street view imagery&lt;/a&gt;, in another top journal.&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Understanding and scoring bikeability is crucial in urban transportation planning.&lt;/li&gt;
&lt;li&gt;Studies so far have largely relied on field visits and manual work.&lt;/li&gt;
&lt;li&gt;Street-level images and computer vision techniques are seldom used in bikeability assessment.&lt;/li&gt;
&lt;li&gt;First and most comprehensive study investigating the usability of these techniques.&lt;/li&gt;
&lt;li&gt;With some caveats, conventional approaches may be replaced with automated virtual audits.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Studies evaluating bikeability usually compute spatial indicators shaping cycling conditions and conflate them in a quantitative index. Much research involves site visits or conventional geospatial approaches, and few studies have leveraged street view imagery (SVI) for conducting virtual audits. These have assessed a limited range of aspects, and not all have been automated using computer vision (CV). Furthermore, studies have not yet zeroed in on gauging the usability of these technologies thoroughly. We investigate, with experiments at a fine spatial scale and across multiple geographies (Singapore and Tokyo), whether we can use SVI and CV to assess bikeability comprehensively. Extending related work, we develop an exhaustive index of bikeability composed of 34 indicators. The results suggest that SVI and CV are adequate to evaluate bikeability in cities comprehensively. As they outperformed non-SVI counterparts by a wide margin, SVI indicators are also found to be superior in assessing urban bikeability and potentially can be used independently, replacing traditional techniques. However, the paper exposes some limitations, suggesting that the best way forward is combining both SVI and non-SVI approaches. The new bikeability index presents a contribution in transportation and urban analytics, and it is scalable to assess cycling appeal widely.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2021-trc-bikeability/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Free access to the article is available on &lt;a href="https://authors.elsevier.com/a/1dn8b,M0mRJjVR" target="_blank" rel="noopener"&gt;this link&lt;/a&gt; until 9 November 2021.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2021-trc-bikeability/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/09/20/new-paper-assessing-bikeability-with-street-view-imagery-and-computer-vision/page-one_hu_a60d77cb75122c2b.webp 400w,
/post/2021/09/20/new-paper-assessing-bikeability-with-street-view-imagery-and-computer-vision/page-one_hu_adb1cade10ae33d3.webp 760w,
/post/2021/09/20/new-paper-assessing-bikeability-with-street-view-imagery-and-computer-vision/page-one_hu_da0d71ae14c2d975.webp 1200w"
src="https://ual.sg/post/2021/09/20/new-paper-assessing-bikeability-with-street-view-imagery-and-computer-vision/page-one_hu_a60d77cb75122c2b.webp"
width="556"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2021_trc_bikeability&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Koichi Ito and Filip Biljecki}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Assessing bikeability with street view imagery and computer vision}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Transportation Research Part C: Emerging Technologies}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{132}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{103371}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2021}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.trc.2021.103371}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;url&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{https://www.sciencedirect.com/science/article/pii/S0968090X21003739}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>UAL researchers develop AI-powered tool to map sustainable roofs globally</title><link>https://ual.sg/post/2021/09/10/ual-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/</link><pubDate>Fri, 10 Sep 2021 08:43:28 +0800</pubDate><guid>https://ual.sg/post/2021/09/10/ual-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/</guid><description>&lt;p&gt;&lt;a href="https://ual.sg/post/2021/06/24/new-paper-roofpedia/"&gt;Our project Roofpedia&lt;/a&gt; was featured as a &lt;a href="https://news.nus.edu.sg/nus-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/" target="_blank" rel="noopener"&gt;news item&lt;/a&gt; by our university.&lt;/p&gt;
&lt;p&gt;Thanks to NUS for sharing our work! &amp;#x1f64f;&lt;/p&gt;
&lt;p&gt;The full release is copied below.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;As cities around the world continue to urbanise, there is a greater need to expand and optimise existing spaces. Cities are increasingly looking into how underutilised rooftop spaces might contribute to climate action, food production, and other purposes. Sustainable roofs, such as those with greenery and photovoltaic panels, can contribute to the roadmap for reducing the carbon footprint of cities, but while studies have been done to gauge their potential, few track cities&amp;rsquo; actual performance.&lt;/p&gt;
&lt;p&gt;To tackle this, Dr &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, Presidential Young Professor from the &lt;a href="https://www.sde.nus.edu.sg/arch/" target="_blank" rel="noopener"&gt;Department of Architecture&lt;/a&gt; at the National University of Singapore (NUS) &lt;a href="https://www.sde.nus.edu.sg" target="_blank" rel="noopener"&gt;School of Design and Environment&lt;/a&gt;, and NUS Master of Architecture graduate Mr &lt;a href="https://ual.sg/author/abraham-noah-wu/"&gt;Abraham Noah Wu&lt;/a&gt; developed an automated tool that uses satellite images to track how rooftops around the world adopt solar panels and/or vegetation. Known as Roofpedia, it uses a fully convolutional neural network (deep learning) which allows researchers and policymakers to study how cities worldwide are greening their rooftops and using them for photovoltaic installations.&lt;/p&gt;
&lt;p&gt;This is a research project under the &lt;a href="https://ual.sg/"&gt;NUS Urban Analytics Lab&lt;/a&gt;, a multidisciplinary research group at the &lt;a href="https://www.sde.nus.edu.sg" target="_blank" rel="noopener"&gt;NUS School of Design and Environment&lt;/a&gt;. Their research was published in the international journal &lt;a href="https://www.sciencedirect.com/science/article/pii/S0169204621001304?via%3Dihub" target="_blank" rel="noopener"&gt;Landscape and Urban Planning&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="tracking-solar-and-green-roof-adoption-in-17-cities"&gt;Tracking solar and green roof adoption in 17 cities&lt;/h3&gt;
&lt;p&gt;With Roofpedia, the researchers created an open roof registry with data from 1 million rooftops across 17 morphologically and geographically diverse cities, spanning Europe, North America, Australia, and Asia. These cities are: Berlin, Copenhagen, Las Vegas, Los Angeles, Luxembourg City, Melbourne, New York, Paris, Phoenix, Portland, San Diego, San Francisco, San Jose, Seattle, Singapore, Vancouver, and Zurich.&lt;/p&gt;
&lt;p&gt;Using this data, the researchers developed the Roofpedia Index, to benchmark the cities by the extent of sustainable roofscape in terms of solar and green roof penetration. This is derived by considering both the area coverage and the number of buildings equipped with solar and green roofs in a city as a percentage value of the entire area.&lt;/p&gt;
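As a rough illustration of the index computation described above, the sketch below combines area coverage and building-count penetration and normalises the best-performing city to 100. This is a hypothetical simplification, not the paper&amp;rsquo;s exact formula, and the city figures are invented for illustration only.

```python
# Hypothetical sketch of a Roofpedia-style index (not the paper's exact formula):
# combine the share of roof area covered and the share of buildings equipped,
# then normalise so the best-performing city scores 100.

def penetration_scores(cities):
    """cities: {name: (covered_area_pct, equipped_buildings_pct)}"""
    raw = {name: (area + count) / 2 for name, (area, count) in cities.items()}
    best = max(raw.values())
    return {name: round(100 * score / best) for name, score in raw.items()}

# Illustrative (made-up) percentages, NOT the study's measured values.
scores = penetration_scores({
    "Zurich": (18.0, 12.0),
    "Las Vegas": (14.0, 10.0),
    "Singapore": (13.0, 7.0),
})
```

Because the scores are normalised against the leader, the index ranks cities relative to the current best performer rather than against an absolute target.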
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/09/10/ual-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/1920_20210907roofpedia-3_hu_4baf92e69bb7416b.webp 400w,
/post/2021/09/10/ual-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/1920_20210907roofpedia-3_hu_55b346243d8ad305.webp 760w,
/post/2021/09/10/ual-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/1920_20210907roofpedia-3_hu_97a209f4fb2db5fd.webp 1200w"
src="https://ual.sg/post/2021/09/10/ual-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/1920_20210907roofpedia-3_hu_4baf92e69bb7416b.webp"
width="760"
height="568"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;Zurich was given an index score of 100 due to its high scores in both area coverage and number of buildings for green roofs. The high green roof coverage is the result of efforts by the Zurich City Government, which has made green roofs mandatory for all new buildings since 1991. Las Vegas topped solar roof adoption in the Index with a score of 86. This could be due to the high solar potential of the geographical area.&lt;/p&gt;
&lt;p&gt;“By collecting such data, Roofpedia allows to gauge how cities might further utilise their rooftops to mitigate carbon emissions and how much untapped potential their roofscapes have. For example, users might complement Roofpedia with other sources of data to study the effectiveness of governmental subsidies and whether climate pledges of others have been followed. In addition, by collecting current data through satellite imagery, users can more accurately determine the present carbon offsetting capacity of cities as well,” Dr Biljecki shared.&lt;/p&gt;
&lt;h3 id="roofpedia-ranks-singapore-third-in-solar-roof-coverage"&gt;Roofpedia ranks Singapore third in solar roof coverage&lt;/h3&gt;
&lt;p&gt;Based on the Roofpedia Index, Singapore is ranked third out of the 17 cities, with a score of 75, for rooftop solar adoption, trailing behind Las Vegas (score of 86) and Zurich (score of 81). The scores in the Index are normalised, and while Singapore scored highly on total area coverage, it trails some other cities because relatively fewer of its buildings are equipped with solar panels.&lt;/p&gt;
&lt;p&gt;Mr Wu said, “While Singapore has ambitious plans to considerably expand its solar energy deployment by 2030 as part of the SG Green Plan, rooftop solar deployment is largely driven by the government. As such, Roofpedia indicates a higher concentration of solar-enabled buildings in heartland areas such as Woodlands, Jurong, or Ang Mo Kio as compared to other districts like Pasir Panjang. Singapore’s relatively lower score for the number of buildings equipped with solar panels indicates that there is potential in engaging private residential and commercial buildings to further maximise Singapore’s rooftops.”&lt;/p&gt;
&lt;p&gt;Mr Wu elaborated that the scoring system considers both building count and total building area, thus allowing users to study the degree of adoption by individual owners as well as the overall extensiveness of rooftop solar and vegetation cover in other cities.&lt;/p&gt;
&lt;figure id="figure-mr-abraham-noah-wu-left-and-dr-filip-biljecki-right-showing-the-features-of-roofpedia-an-automated-tool-that-they-had-developed-which-uses-satellite-images-to-track-solar-and-green-roof-penetration-credit-nus"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Mr Abraham Noah Wu (left) and Dr Filip Biljecki (right) showing the features of Roofpedia, an automated tool that they had developed, which uses satellite images to track solar and green roof penetration. Credit: NUS." srcset="
/post/2021/09/10/ual-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/1920_20210907roofpedia-1_hu_c0600d0a8198d6d3.webp 400w,
/post/2021/09/10/ual-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/1920_20210907roofpedia-1_hu_ce1216719656a2a1.webp 760w,
/post/2021/09/10/ual-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/1920_20210907roofpedia-1_hu_b740b061774e5e8f.webp 1200w"
src="https://ual.sg/post/2021/09/10/ual-researchers-develop-ai-powered-tool-to-map-sustainable-roofs-globally/1920_20210907roofpedia-1_hu_c0600d0a8198d6d3.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Mr Abraham Noah Wu (left) and Dr Filip Biljecki (right) showing the features of Roofpedia, an automated tool that they had developed, which uses satellite images to track solar and green roof penetration. Credit: NUS.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;The researchers emphasised that each city has its unique characteristics, and the exact benefit of greenery or solar panels on rooftops depends much on the urban form and design. The geolocation and macroclimate of the city also play a part. For instance, in drier areas, green roofs are harder to maintain, while in rainy and overcast areas, solar roofs may not make economic sense. Taking these limitations into consideration, a city could still be environmentally progressive without a sustainable roofscape.&lt;/p&gt;
&lt;p&gt;Dr Biljecki explained, “Vancouver may not score well in the Index but is nevertheless consistently ranked as one of the most sustainable cities in the world with access to plenty of hydropower, providing for 25 per cent of the city’s energy need alone, and has plans to derive 100 per cent of the energy used from renewable sources before 2050. What the Roofpedia Index does is that it can complement existing sustainability indices by adding a new dimension of consideration in assessing the overall sustainability of a city.”&lt;/p&gt;
&lt;h3 id="the-future-of-roofpedia"&gt;The future of Roofpedia&lt;/h3&gt;
&lt;p&gt;The research team has made Roofpedia’s data of 1 million rooftops openly accessible and hopes to encourage other scholars to come on board and collaborate with them to expand the database by tracking more cities or including other environmental indicators.&lt;/p&gt;
&lt;p&gt;Dr Biljecki shared that the accuracy of Roofpedia’s results would depend on the quality and period of satellite images provided as well as whether the approach would distinguish other man-made features (such as skylights) from solar panels. However, when the data is aggregated at the city-scale, Roofpedia can generally indicate how sustainable a city’s rooftops are, enabling cross-city comparative analyses.&lt;/p&gt;
&lt;p&gt;“Our project is designed to be open as cities today are dynamic and rapidly adopting sustainable instruments. In addition, its design is modular, meaning that new geographies, roof typologies, and functions can be added. As such, our research group is planning to add a temporal feature so that users can study the evolution of sustainable rooftop measures over time and how much more cities might increase their roofscapes. We hope that our work can aid researchers, local governments, and the public in understanding and promoting the further use of rooftops in achieving sustainable urban development for a carbon neutral world,” Dr Biljecki said.&lt;/p&gt;
&lt;p&gt;Please click &lt;a href="https://ual.sg/project/roofpedia/"&gt;here&lt;/a&gt; for more information on Roofpedia.&lt;/p&gt;</description></item><item><title>Graduation projects completed at our Lab in 2021</title><link>https://ual.sg/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/</link><pubDate>Wed, 01 Sep 2021 05:15:02 +0800</pubDate><guid>https://ual.sg/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/</guid><description>&lt;p&gt;We are proud to announce that in the last few months, four students have completed their studies by carrying out a graduation project with us.&lt;/p&gt;
&lt;h3 id="exploiting-real-estate-data-for-the-enrichment-of-spatial-databases"&gt;Exploiting real estate data for the enrichment of spatial databases&lt;/h3&gt;
&lt;p&gt;In her excellent thesis for the degree of MSc in Applied GIS, &lt;a href="https://ual.sg/author/xinyu-chen/"&gt;Xinyu Chen&lt;/a&gt; has researched extracting information on buildings and amenities from latent sources of data &amp;ndash; property transactions and real estate ads.
The work has resulted in an entirely new approach to collect spatial data, and it posits that real estate data may be considered as an uncovered type of volunteered geoinformation.&lt;/p&gt;
&lt;p&gt;Her thesis was also &lt;a href="https://ual.sg/post/2022/11/24/new-paper-mining-real-estate-ads-and-property-transactions-for-building-and-amenity-data-acquisition/"&gt;published as a paper in Urban Informatics&lt;/a&gt;.&lt;/p&gt;
&lt;figure id="figure-approximate-locations-of-unmapped-fitness-centres-in-singapore-which-have-been-discovered-by-xinyus-approach-of-analysing-real-estate-information"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Approximate locations of unmapped fitness centres in Singapore, which have been discovered by Xinyu&amp;#39;s approach of analysing real estate information." srcset="
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/fc_hu_f35801a68affee4f.webp 400w,
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/fc_hu_ea356b9ebc47bd1f.webp 760w,
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/fc_hu_8d4b18040d22394f.webp 1200w"
src="https://ual.sg/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/fc_hu_f35801a68affee4f.webp"
width="755"
height="531"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Approximate locations of unmapped fitness centres in Singapore, which have been discovered by Xinyu&amp;rsquo;s approach of analysing real estate information.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="3d-building-reconstruction-with-sparse-street-view-images-using-deep-learning-techniques"&gt;3D building reconstruction with sparse street view images using deep learning techniques&lt;/h3&gt;
&lt;p&gt;Street view imagery has been quite an interesting and relatively new source of urban data (e.g. read &lt;a href="https://ual.sg/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/"&gt;our recent review paper&lt;/a&gt;).
However, it has not been used much for generating 3D city models.
Further, because of vegetation and other obstacles, sometimes only one clear image of a particular building is available, inhibiting standard photogrammetric and other approaches.
Therefore, new methods have to be developed for 3D building model reconstruction in such scenarios.&lt;/p&gt;
&lt;p&gt;In her ambitious graduation project, &lt;a href="https://ual.sg/author/hui-en-pang/"&gt;Hui En Pang&lt;/a&gt; has developed a method to reconstruct 3D building models from single street view images.
Following the completion of this successful research, at the intersection of computer vision, 3D modelling and geographic information science, Hui En has been awarded an MSc in Applied GIS.&lt;/p&gt;
&lt;p&gt;The paper on her research was &lt;a href="https://ual.sg/post/2022/06/17/new-paper-3d-building-reconstruction-from-single-street-view-images-using-deep-learning/"&gt;published in JAG&lt;/a&gt;.&lt;/p&gt;
&lt;figure id="figure-reconstruction-of-3d-models-from-single-images-based-on-hui-ens-method"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Reconstruction of 3D models from single images, based on Hui En&amp;#39;s method." srcset="
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/3ddl_hu_c130b76e6f0ade71.webp 400w,
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/3ddl_hu_378e71aa0ebe4bd8.webp 760w,
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/3ddl_hu_34bbf668cca38284.webp 1200w"
src="https://ual.sg/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/3ddl_hu_c130b76e6f0ade71.webp"
width="760"
height="304"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Reconstruction of 3D models from single images, based on Hui En&amp;rsquo;s method.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="usability-of-street-view-imagery-in-assessing-bikeability"&gt;Usability of street view imagery in assessing bikeability&lt;/h3&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/koichi-ito/"&gt;Koichi Ito&lt;/a&gt; graduated from the Master of Urban Planning with an interesting research on a new application of computer vision and street view imagery: assessing bikeability.
Koichi&amp;rsquo;s excellent work has demonstrated the value of these technologies and datasets for virtual audits gauging the cycling appeal, widely and at a high resolution.&lt;/p&gt;
&lt;p&gt;His thesis has been condensed into a paper &amp;ndash; &lt;a href="https://ual.sg/post/2021/09/20/new-paper-assessing-bikeability-with-street-view-imagery-and-computer-vision/"&gt;it has been published in TRC&lt;/a&gt;.&lt;/p&gt;
&lt;figure id="figure-bikeability-maps-in-singapore-and-tokyo-generated-with-koichis-automated-assessment"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Bikeability maps in Singapore and Tokyo, generated with Koichi&amp;#39;s automated assessment." srcset="
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/bikeability-maps_hu_fccb1fa5f16e94f4.webp 400w,
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/bikeability-maps_hu_6a56a12472684ab8.webp 760w,
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/bikeability-maps_hu_52238ea82bf34ad0.webp 1200w"
src="https://ual.sg/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/bikeability-maps_hu_fccb1fa5f16e94f4.webp"
width="760"
height="589"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Bikeability maps in Singapore and Tokyo, generated with Koichi&amp;rsquo;s automated assessment.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="classification-of-urban-morphology-with-deep-learning-a-case-study-of-its-application-on-urban-vitality-prediction"&gt;Classification of urban morphology with deep learning: A case study of its application on urban vitality prediction&lt;/h3&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/wangyang-chen/"&gt;Wangyang Chen&lt;/a&gt; graduated with a master degree from our MUP programme.
He has done a great job on spearheading a new application of deep learning in urban analytics: characterising urban morphology and linking it to urban vibrancy.
The code he developed has been &lt;a href="https://github.com/ualsg/Road-Network-Classification" target="_blank" rel="noopener"&gt;released as open-source software&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Further, the paper based on his research was &lt;a href="https://ual.sg/post/2021/08/22/new-paper-classification-of-urban-morphology-with-deep-learning/"&gt;published in CEUS&lt;/a&gt;.&lt;/p&gt;
&lt;figure id="figure-categorical-maps-of-the-urban-form-at-a-high-spatial-resolution-determined-with-wangyangs-deep-learning-approach"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Categorical maps of the urban form at a high spatial resolution, determined with Wangyang&amp;#39;s deep learning approach." srcset="
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/um_hu_a8f0e940dee5ca70.webp 400w,
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/um_hu_f6ceab1dff6c7469.webp 760w,
/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/um_hu_e2ba7957098d9382.webp 1200w"
src="https://ual.sg/post/2021/09/01/graduation-projects-completed-at-our-lab-in-2021/um_hu_a8f0e940dee5ca70.webp"
width="760"
height="230"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Categorical maps of the urban form at a high spatial resolution, determined with Wangyang&amp;rsquo;s deep learning approach.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Congratulations to everyone on the awesome job and your degrees, well deserved &amp;#x1f393; &amp;#x1f44f;.
We wish you all the best in your future career steps.&lt;/p&gt;</description></item><item><title>Welcome to our new researchers</title><link>https://ual.sg/post/2021/08/31/welcome-to-our-new-researchers/</link><pubDate>Tue, 31 Aug 2021 15:10:28 +0800</pubDate><guid>https://ual.sg/post/2021/08/31/welcome-to-our-new-researchers/</guid><description>
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/08/31/welcome-to-our-new-researchers/featured_hu_f3fb21fe02b5b169.webp 400w,
/post/2021/08/31/welcome-to-our-new-researchers/featured_hu_f064008f2426c6e5.webp 760w,
/post/2021/08/31/welcome-to-our-new-researchers/featured_hu_d340c72ebea32542.webp 1200w"
src="https://ual.sg/post/2021/08/31/welcome-to-our-new-researchers/featured_hu_f3fb21fe02b5b169.webp"
width="760"
height="447"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;We are pleased to introduce five full-time researchers in our group &amp;#x1f44b;:
&lt;a href="https://ual.sg/author/binyu-lei/"&gt;Binyu Lei&lt;/a&gt;,
&lt;a href="https://ual.sg/author/leon-gaw/"&gt;Leon Gaw&lt;/a&gt;,
&lt;a href="https://ual.sg/author/yujun-hou/"&gt;Yujun Hou&lt;/a&gt;,
&lt;a href="https://ual.sg/author/winston-yap/"&gt;Winston Yap&lt;/a&gt;, and
&lt;a href="https://ual.sg/author/mengbi-ye/"&gt;Mengbi Ye&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We are excited to have you as part of our team, and we wish you success in your new roles. &amp;#x263a;&amp;#xfe0f;&lt;/p&gt;</description></item><item><title>New paper: Classification of Urban Morphology with Deep Learning</title><link>https://ual.sg/post/2021/08/22/new-paper-classification-of-urban-morphology-with-deep-learning/</link><pubDate>Sun, 22 Aug 2021 09:55:16 +0800</pubDate><guid>https://ual.sg/post/2021/08/22/new-paper-classification-of-urban-morphology-with-deep-learning/</guid><description>&lt;p&gt;We have a new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Chen W, Wu AN, Biljecki F (2021): Classification of Urban Morphology with Deep Learning: Application on Urban Vitality. &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt; 90: 101706. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2021.101706" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2021.101706&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-ceus-dl-morphology/2021-ceus-dl-morphology.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;In this research, an entirely new approach to characterise the urban form is developed.
The resulting data, derived at a high resolution (the example of the classification is available in the header image above), is then used to study its association with vibrancy in 9 cities.&lt;/p&gt;
&lt;p&gt;Congratulations to &lt;a href="https://ual.sg/author/wangyang-chen/"&gt;Wangyang Chen&lt;/a&gt;, our Master of Urban Planning graduate, on the great job and his first paper! &amp;#x1f64c; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;The code supporting this research has been &lt;a href="https://github.com/ualsg/Road-Network-Classification" target="_blank" rel="noopener"&gt;released as open-source software&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;CEUS is a top 1% journal in its discipline according to Scopus.&lt;/p&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Characterizing urban morphology is important in urban analytics&lt;/li&gt;
&lt;li&gt;The use of deep learning has not been investigated for this purpose&lt;/li&gt;
&lt;li&gt;We develop an approach to automatically classify urban forms from street networks&lt;/li&gt;
&lt;li&gt;The value of the work is demonstrated on an application on urban vitality&lt;/li&gt;
&lt;li&gt;Our deep learning approach reveals additional insights and enhances studies&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;There is a prevailing trend to study urban morphology quantitatively thanks to the growing accessibility to various forms of spatial big data, increasing computing power, and use cases benefiting from such information. The methods developed up to now measure urban morphology with numerical indices describing density, proportion, and mixture, but they do not directly represent morphological features from the human&amp;rsquo;s visual and intuitive perspective. We take the first step to bridge the gap by proposing a deep learning-based technique to automatically classify road networks into four classes on a visual basis. The method is implemented by generating an image of the street network (Colored Road Hierarchy Diagram), which we introduce in this paper, and classifying it using a deep convolutional neural network (ResNet-34). The model achieves an overall classification accuracy of 0.875. Nine cities around the world are selected as the study areas with their road networks acquired from OpenStreetMap. Latent subgroups among the cities are uncovered through clustering on the percentage of each road network category. In the subsequent part of the paper, we focus on the usability of such classification: we apply our method in a case study of urban vitality prediction. An advanced tree-based regression model (LightGBM) is for the first time designated to establish the relationship between morphological indices and vitality indicators. The effect of road network classification is found to be small but positively associated with urban vitality. This work expands the toolkit of quantitative urban morphology study with new techniques, supporting further studies in the future.&lt;/p&gt;
&lt;/blockquote&gt;
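&lt;p&gt;As a purely illustrative aside (not the paper&amp;rsquo;s code): the clustering step mentioned in the abstract, which uncovers latent subgroups of cities from the percentage of each road network category, can be sketched as a plain k-means over the percentage vectors. The city profiles below are invented for demonstration.&lt;/p&gt;

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Basic k-means on a list of equal-length feature vectors."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    assignment = [0] * len(points)
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean).
        for i, p in enumerate(points):
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            assignment[i] = dists.index(min(dists))
        # Recompute each centroid as the mean of its members.
        for j in range(k):
            members = [p for i, p in enumerate(points) if assignment[i] == j]
            if members:
                centroids[j] = [sum(col) / len(members) for col in zip(*members)]
    return assignment

# Invented share of each of the four road-network classes per city.
profiles = [
    [0.70, 0.10, 0.10, 0.10],  # grid-dominated
    [0.65, 0.15, 0.10, 0.10],  # grid-dominated
    [0.10, 0.60, 0.20, 0.10],  # organic-dominated
    [0.15, 0.55, 0.20, 0.10],  # organic-dominated
]
labels = kmeans(profiles, k=2)
```

&lt;p&gt;With these made-up profiles, the two grid-dominated cities fall into one cluster and the two organic-dominated ones into the other.&lt;/p&gt;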
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2021-ceus-dl-morphology/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2021-ceus-dl-morphology/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/08/22/new-paper-classification-of-urban-morphology-with-deep-learning/page-one_hu_c2e8999d123bc4f0.webp 400w,
/post/2021/08/22/new-paper-classification-of-urban-morphology-with-deep-learning/page-one_hu_48ddfcfd147c1455.webp 760w,
/post/2021/08/22/new-paper-classification-of-urban-morphology-with-deep-learning/page-one_hu_ee2a30b2a7387537.webp 1200w"
src="https://ual.sg/post/2021/08/22/new-paper-classification-of-urban-morphology-with-deep-learning/page-one_hu_c2e8999d123bc4f0.webp"
width="561"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2021_ceus_dl_morphology&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Wangyang Chen and Abraham Noah Wu and Filip Biljecki}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2021.101706}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{101706}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Classification of Urban Morphology with Deep Learning: Application on Urban Vitality}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;url&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{https://doi.org/10.1016/j.compenvurbsys.2021.101706}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{90}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="m"&gt;2021&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: review on street view imagery in urban analytics and GIS</title><link>https://ual.sg/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/</link><pubDate>Sat, 14 Aug 2021 15:00:16 +0800</pubDate><guid>https://ual.sg/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/</guid><description>&lt;p&gt;We have a new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Ito K (2021): Street view imagery in urban analytics and GIS: A review. &lt;em&gt;Landscape and Urban Planning&lt;/em&gt; 215: 104217. &lt;a href="https://doi.org/10.1016/j.landurbplan.2021.104217" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.landurbplan.2021.104217&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-land-svi-review/2021-land-svi-review.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Street view imagery has rapidly risen to prominence as an important and omnipresent urban data source.
In this review, published open access, &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; and &lt;a href="https://ual.sg/author/koichi-ito/"&gt;Koichi Ito&lt;/a&gt; have combed through 619 papers to explore the versatile range of applications of street view imagery and current challenges, services, and opportunities.&lt;/p&gt;
&lt;p&gt;The review presents scores of interesting insights.
For example, the most comprehensive source of street view imagery &amp;ndash; Google Street View &amp;ndash; dominates the field, underpinning two thirds of the body of knowledge, followed by Baidu &amp;amp; Tencent. All these commercial services are increasingly restrictive, and easy access is not guaranteed in the future. On the other hand, volunteered SVI services &amp;ndash; Mapillary &amp;amp; KartaView &amp;ndash; provide an alternative, but they are used infrequently owing to scarcer panoramas and heterogeneous quality and coverage. Crowdsourced street view imagery has advantages too, e.g. a permissive licence, in contrast to the restrictions of its commercial counterparts.&lt;/p&gt;
&lt;p&gt;Landscape and Urban Planning is a top 1% journal in its discipline according to Scopus.&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/5_hu_d2773b8f0046ed7b.webp 400w,
/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/5_hu_d0a8c3d8ebbdfcac.webp 760w,
/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/5_hu_1c2de87d13c66e9.webp 1200w"
src="https://ual.sg/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/5_hu_d2773b8f0046ed7b.webp"
width="760"
height="370"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Street-level imagery became ingrained as an important urban data source.&lt;/li&gt;
&lt;li&gt;Most comprehensive review on street view imagery in geospatial and urban studies.&lt;/li&gt;
&lt;li&gt;We have screened 619 papers to identify the state of the art, focusing on applications.&lt;/li&gt;
&lt;li&gt;250 studies are classified into 10 application domains and span dozens of use cases.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Street view imagery has rapidly ascended as an important data source for geospatial data collection and urban analytics, deriving insights and supporting informed decisions. Such surge has been mainly catalysed by the proliferation of large-scale imagery platforms, advances in computer vision and machine learning, and availability of computing resources. We screened more than 600 recent papers to provide a comprehensive systematic review of the state of the art of how street-level imagery is currently used in studies pertaining to the built environment. The main findings are that: (i) street view imagery is now clearly an entrenched component of urban analytics and GIScience; (ii) most of the research relies on data from Google Street View; and (iii) it is used across myriads of domains with numerous applications – ranging from analysing vegetation and transportation to health and socio-economic studies. A notable trend is crowdsourced street view imagery, facilitated by services such as Mapillary and KartaView, in some cases furthering geographical coverage and temporal granularity, at a permissive licence.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2021-land-svi-review/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2021-land-svi-review/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/page-one_hu_94f541cb6ac4bf59.webp 400w,
/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/page-one_hu_f0225c3adaeee2d4.webp 760w,
/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/page-one_hu_d71de276ef8b487b.webp 1200w"
src="https://ual.sg/post/2021/08/14/new-paper-review-on-street-view-imagery-in-urban-analytics-and-gis/page-one_hu_94f541cb6ac4bf59.webp"
width="571"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2021_land_svi_review&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Filip Biljecki and Koichi Ito}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.landurbplan.2021.104217}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Landscape and Urban Planning}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104217}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Street view imagery in urban analytics and GIS: A review}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;url&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{https://doi.org/10.1016/j.landurbplan.2021.104217}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{215}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="m"&gt;2021&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Roofpedia</title><link>https://ual.sg/post/2021/06/24/new-paper-roofpedia/</link><pubDate>Thu, 24 Jun 2021 17:00:16 +0800</pubDate><guid>https://ual.sg/post/2021/06/24/new-paper-roofpedia/</guid><description>&lt;p&gt;We have a new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Wu AN, Biljecki F (2021): Roofpedia: Automatic mapping of green and solar roofs for an open roofscape registry and evaluation of urban sustainability. &lt;em&gt;Landscape and Urban Planning&lt;/em&gt; 214: 104167. &lt;a href="https://doi.org/10.1016/j.landurbplan.2021.104167" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.landurbplan.2021.104167&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-land-roofpedia/2021-land-roofpedia.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;In this study, published in &lt;em&gt;Landscape and Urban Planning&lt;/em&gt;, we have mapped the content of 1 million rooftops in 17 cities around the world, detecting solar panels and greenery, to understand their prevalence in cities.
Through Roofpedia, a research prototype developed thanks to the excellent work of &lt;a href="https://ual.sg/author/abraham-noah-wu/"&gt;Abraham Noah Wu&lt;/a&gt; (his first paper, in a top journal no less &amp;#x1f389; &amp;#x1f44f;), we can obtain insights that may aid in uncovering the pattern of sustainable rooftops, complement studies on their potential, evaluate the effectiveness of existing incentives, verify the use of subsidies and fulfilment of climate pledges, estimate carbon offset capacities of cities, and ultimately support better policies and strategies to increase the adoption of instruments for sustainable development of urban areas.&lt;/p&gt;
&lt;p&gt;Landscape and Urban Planning is a top 1% journal in its discipline according to Scopus.&lt;/p&gt;
&lt;figure id="figure-illustration-describing-the-project-and-its-organisation-in-three-parts"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Illustration describing the project and its organisation in three parts." srcset="
/post/2021/06/24/new-paper-roofpedia/grabs_hu_e3d3e73eb8f9e8bf.webp 400w,
/post/2021/06/24/new-paper-roofpedia/grabs_hu_772892022b5e7e7.webp 760w,
/post/2021/06/24/new-paper-roofpedia/grabs_hu_6de6caebce33eeea.webp 1200w"
src="https://ual.sg/post/2021/06/24/new-paper-roofpedia/grabs_hu_e3d3e73eb8f9e8bf.webp"
width="760"
height="568"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Illustration describing the project and its organisation in three parts.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;There is a lack of open data on urban rooftop typology and current use of roofs.&lt;/li&gt;
&lt;li&gt;A deep learning and GIS workflow to map and quantify green and solar roofs.&lt;/li&gt;
&lt;li&gt;A generated dataset that covers 17 cities, scalable to include more locations.&lt;/li&gt;
&lt;li&gt;An index to benchmark the proliferation of green and solar roofs in cities.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Sustainable roofs, such as those with greenery and photovoltaic panels, contribute to the roadmap for reducing the carbon footprint of cities. However, research on sustainable urban roofscapes is rather focused on their potential and it is hindered by the scarcity of data, limiting our understanding of their current content, spatial distribution, and temporal evolution. To tackle this issue, we introduce Roofpedia, a set of three contributions: (i) automatic mapping of relevant urban roof typology from satellite imagery; (ii) an open roof registry mapping the spatial distribution and area of solar and green roofs of more than one million buildings across 17 cities; and (iii) the Roofpedia Index, a derivative of the registry, to benchmark the cities by the extent of sustainable roofscape in term of solar and green roof penetration. This project, partly inspired by its street greenery counterpart ‘Treepedia’, is made possible by a multi-step pipeline that combines deep learning and geospatial techniques, demonstrating the feasibility of an automated methodology that generalises successfully across cities with an accuracy of detecting sustainable roofs of up to 100% in some cities. We offer our results as an interactive map and open dataset so that our work could aid researchers, local governments, and the public to uncover the pattern of sustainable rooftops across cities, track and monitor the current use of rooftops, complement studies on their potential, evaluate the effectiveness of existing incentives, verify the use of subsidies and fulfilment of climate pledges, estimate carbon offset capacities of cities, and ultimately support better policies and strategies to increase the adoption of instruments contributing to the sustainable development of cities.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2021-land-roofpedia/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2021-land-roofpedia/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/06/24/new-paper-roofpedia/page-one_hu_b59a769c1cd408fd.webp 400w,
/post/2021/06/24/new-paper-roofpedia/page-one_hu_b931567604209703.webp 760w,
/post/2021/06/24/new-paper-roofpedia/page-one_hu_24dcc1e8b7485b17.webp 1200w"
src="https://ual.sg/post/2021/06/24/new-paper-roofpedia/page-one_hu_b59a769c1cd408fd.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2021_land_roofpedia&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Abraham Noah Wu and Filip Biljecki}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.landurbplan.2021.104167}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Landscape and Urban Planning}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{104167}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Roofpedia: Automatic mapping of green and solar roofs for an open roofscape registry and evaluation of urban sustainability}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;url&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{https://doi.org/10.1016%2Fj.landurbplan.2021.104167}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{214}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="m"&gt;2021&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Impact of the pandemic on bus ridership in Singapore</title><link>https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/</link><pubDate>Mon, 31 May 2021 08:43:28 +0800</pubDate><guid>https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/</guid><description>&lt;p&gt;Each year, &lt;a href="https://www.sde.nus.edu.sg/arch/programmes/master-of-urban-planning/curriculum/" target="_blank" rel="noopener"&gt;Master of Urban Planning&lt;/a&gt; students taking the module &lt;a href="https://www.sde.nus.edu.sg/arch/programme_article/dep5111-planning-technologies/" target="_blank" rel="noopener"&gt;DEP5111 Planning Technologies&lt;/a&gt; carry out group projects to demonstrate the attained data science skills and apply them in the urban context.&lt;/p&gt;
&lt;p&gt;The article below features one of the projects completed in this semester by Chen Bingling, Liang Xiucheng, Liu Lina, and Wen Ruiwen.&lt;/p&gt;
&lt;p&gt;This final group project, completed in early April 2021, compares public transportation (bus) passenger volumes just before the onset of the pandemic and the introduction of the safety measures (January 2020) with those a year later, several months after many of the restrictions had been progressively lifted.&lt;/p&gt;
&lt;p&gt;Each group project, including this one, was carried out entirely using open data and open-source software.&lt;/p&gt;
&lt;p&gt;We thank all the students in the class for their interesting projects.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The COVID-19 pandemic greatly impacted cities around the world in various ways, e.g. by decreasing public transportation ridership. This study investigates the influence of COVID-19 on bus travel behavior across different land uses and planning areas in Singapore, and analyzes the recovery rate of bus stops using an evaluation model. According to the data available at the time of the analysis, 92% of bus stops had not yet returned to their January 2020 passenger volumes. The study reveals some notable results, e.g. after the strictest measures were lifted in June 2020, Marina South and the Southern Islands exhibited a sharp rise in traffic in July 2020. Nevertheless, in comparison with other planning areas, these locations still have a low recovery rate, presumably partly due to the loss of international visitors. In relative terms, residential areas tend to recover better. However, institutional and business-related land uses remain below expected levels, partially because a large portion of the population has transitioned to working and learning from home.
The results also suggest an inverse relationship between the share of residents living in private housing and the recovery rate of bus passenger volume at the scale of a planning area.&lt;/p&gt;
&lt;figure id="figure-photo-by-shawnanggghttpsunsplashcomshawnanggg-on-unsplashhttpsunsplashcomsphotoscovid-singapore-bus"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Photo by [shawnanggg](https://unsplash.com/@shawnanggg) on [Unsplash](https://unsplash.com/s/photos/covid-singapore-bus)." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/shawnanggg-UmxIsHNZvTs-unsplash_hu_30234df6cf449938.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/shawnanggg-UmxIsHNZvTs-unsplash_hu_185a3393101283e2.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/shawnanggg-UmxIsHNZvTs-unsplash_hu_86add02be1c00892.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/shawnanggg-UmxIsHNZvTs-unsplash_hu_30234df6cf449938.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Photo by &lt;a href="https://unsplash.com/@shawnanggg" target="_blank" rel="noopener"&gt;shawnanggg&lt;/a&gt; on &lt;a href="https://unsplash.com/s/photos/covid-singapore-bus" target="_blank" rel="noopener"&gt;Unsplash&lt;/a&gt;.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="background"&gt;Background&lt;/h3&gt;
&lt;p&gt;Starting in April 2020, &lt;a href="https://en.wikipedia.org/wiki/2020%e2%80%9321_Singapore_circuit_breaker_measures" target="_blank" rel="noopener"&gt;circuit breaker measures&lt;/a&gt; were implemented to prevent further spread of COVID-19. The measures were successful, but as everywhere around the world, they had an &lt;a href="https://en.wikipedia.org/wiki/COVID-19_pandemic_in_Singapore#Transportation" target="_blank" rel="noopener"&gt;impact on transportation&lt;/a&gt;. The Land Transport Authority reported that average daily ridership for buses and trains fell by 34.5% – an 11-year low&lt;sup id="fnref:1"&gt;&lt;a href="#fn:1" class="footnote-ref" role="doc-noteref"&gt;1&lt;/a&gt;&lt;/sup&gt;.
The city-state was not alone in experiencing plummeting ridership: in Washington DC, Metrorail and bus ridership declined by 75%&lt;sup id="fnref:2"&gt;&lt;a href="#fn:2" class="footnote-ref" role="doc-noteref"&gt;2&lt;/a&gt;&lt;/sup&gt;; in London, the usage of public transportation dropped by 40%&lt;sup id="fnref:3"&gt;&lt;a href="#fn:3" class="footnote-ref" role="doc-noteref"&gt;3&lt;/a&gt;&lt;/sup&gt;.&lt;/p&gt;
&lt;p&gt;The aim of this study is to understand the changes in bus ridership over the course of COVID-19 and to what extent people’s movement has recovered from the pandemic.
The analysis focuses on Singapore’s planning areas and land use, and it also analyzes relationships with demographics.&lt;/p&gt;
&lt;h3 id="data-and-method"&gt;Data and Method&lt;/h3&gt;
&lt;p&gt;This study investigates monthly passenger volume changes in 2020 alongside other factors such as land use and the characteristics of planning areas. Based on a comparison of passenger volume and flow direction in January 2021 with the same period of 2020, an evaluation model of passenger volume recovery is built and implemented for each bus stop to identify changes in bus ridership behavior. The study also discusses implications for the future planning of bus stops and transport networks.&lt;/p&gt;
&lt;p&gt;There are two key data sources used in this analysis: (i) origin–destination data from the &lt;a href="https://datamall.lta.gov.sg" target="_blank" rel="noopener"&gt;Land Transport Authority&lt;/a&gt;; and (ii) land use and planning area boundaries according to the Master Plan of the &lt;a href="https://data.gov.sg/search?organization=urban-redevelopment-authority" target="_blank" rel="noopener"&gt;Urban Redevelopment Authority&lt;/a&gt;.
Further, demographic data from the &lt;a href="https://www.singstat.gov.sg" target="_blank" rel="noopener"&gt;Singapore Department of Statistics&lt;/a&gt; have been used to associate the change in public transport ridership with demographics.&lt;/p&gt;
&lt;p&gt;In brief, we first calculate and aggregate the passenger volumes by land use and planning area according to a 50m buffer around each bus stop, and observe the general trends.
Afterwards, we calculate the rate of change of total passenger volume (the sum of tap-ins and tap-outs) relative to the previous month.&lt;/p&gt;
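&lt;p&gt;To make the workflow concrete, here is a minimal, hypothetical sketch of these steps; the coordinates, tap counts, and the haversine-based 50 m buffer check are illustrative stand-ins for the actual GIS processing, not the project’s code.&lt;/p&gt;

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_buffer(stop, point, radius_m=50.0):
    """True if the point lies no farther than radius_m from the bus stop."""
    d = haversine_m(stop[0], stop[1], point[0], point[1])
    return not d > radius_m

def total_volume(records):
    """Total passenger volume: the sum of tap-ins and tap-outs."""
    return sum(rec["tap_in"] + rec["tap_out"] for rec in records)

def rate_of_change(current, previous):
    """Relative change of volume versus the previous month."""
    return (current - previous) / previous

def recovery_rate(vol_jan_2021, vol_jan_2020):
    """Share of the January 2020 volume reached in January 2021."""
    return vol_jan_2021 / vol_jan_2020

# Invented example: one bus stop, with its January volumes.
jan_2020 = total_volume([{"tap_in": 900, "tap_out": 850}])
jan_2021 = total_volume([{"tap_in": 540, "tap_out": 510}])
```

&lt;p&gt;For example, a stop whose total volume fell from 1750 in January 2020 to 1050 in January 2021 has a recovery rate of 0.6, i.e. it regained 60% of its pre-pandemic volume.&lt;/p&gt;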
&lt;h3 id="results"&gt;Results&lt;/h3&gt;
&lt;h4 id="planning-area-and-land-use-type-change-rate-by-month"&gt;Planning area and land use type change rate by month&lt;/h4&gt;
&lt;figure id="figure-fig-1"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Rate of change of bus ridership by planning area." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/1_hu_686ab92cf003608c.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/1_hu_702c1e411dc54173.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/1_hu_b364f6eda8ccc415.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/1_hu_686ab92cf003608c.webp"
width="760"
height="602"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Rate of change of bus ridership by planning area.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;In Figure &lt;a href="#fig-1"&gt;1&lt;/a&gt;, we summarize the rate of change of passenger volume in each planning area and visualize with heatmap.
There is a visible change during and after the period April-May 2020 when the most stringent measures were in place.
Marina South and Southern Islands exhibit explosive growth in volumes in July 2020.&lt;/p&gt;
&lt;p&gt;Using the same method, we calculate the rate of change by land use.&lt;/p&gt;
&lt;figure id="figure-fig-2"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Rate of change of bus ridership by land use." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/2_hu_b5722459c8e92709.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/2_hu_4fc66be455955861.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/2_hu_12e55edd74e4dce2.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/2_hu_b5722459c8e92709.webp"
width="760"
height="600"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Rate of change of bus ridership by land use.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;In Figure &lt;a href="#fig-2"&gt;2&lt;/a&gt;, bus stops associated with the land use of Beach Area (according to the URA Master Plan) have taken the lead in a positive upward trend of total passenger volume in May 2020. These are mostly locations close to residential areas (e.g. East Coast Park), and it can be assumed that residents preferred close seaside parks for leisure activities during Phase I.&lt;/p&gt;
&lt;p&gt;In the meantime, in June 2020 the Educational Institution land use exhibited the most significant increase, while Port/Airport showed only a slight one. This can be explained by the large number of students resuming classes while many employees were still working from home, and by the severely restricted international travel.&lt;/p&gt;
&lt;h4 id="period-over-period-comparison-of-total-passenger-volume-and-traveling-direction"&gt;Period-over-period comparison of total passenger volume and traveling direction&lt;/h4&gt;
&lt;p&gt;We used Sankey diagrams to understand how the flow of passengers between land uses has changed. Since land use types with small flows would appear as thin lines that are difficult to distinguish, the 11 land-use types with the largest passenger flows are chosen for graphical representation, for each period separately.&lt;/p&gt;
&lt;figure id="figure-fig-3"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Flow of Passenger Traffic by land use type in January 2020. Data: LTA &amp; URA, 2021." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/3_hu_a99d0e26e09e65e5.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/3_hu_3b994fa7d4548f4.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/3_hu_bb5869328c98d2b8.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/3_hu_a99d0e26e09e65e5.webp"
width="700"
height="577"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Flow of Passenger Traffic by land use type in January 2020. Data: LTA &amp;amp; URA, 2021.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;figure id="figure-fig-4"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Flow of Passenger Traffic by land use type in January 2021. Data: LTA &amp; URA, 2021." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/4_hu_db959d823594eda4.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/4_hu_725e6193de2b78db.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/4_hu_a6636ee361be6f30.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/4_hu_db959d823594eda4.webp"
width="708"
height="582"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Flow of Passenger Traffic by land use type in January 2021. Data: LTA &amp;amp; URA, 2021.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Comparing Figure &lt;a href="#fig-3"&gt;3&lt;/a&gt; and Figure &lt;a href="#fig-4"&gt;4&lt;/a&gt;, we can see that the volume flowing from residential to transport facilities decreases, the volume flowing from park to Business 2 increases significantly, and the flow from commercial to business decreases markedly.&lt;/p&gt;
&lt;p&gt;Next, we compare the total passenger volumes by planning area for January 2020 and January 2021.&lt;/p&gt;
&lt;figure id="figure-fig-5"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Changes of total volume by planning area." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/5_hu_64af7f0891288a69.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/5_hu_f9c56bad84fce0be.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/5_hu_2a9d99ce23a77aac.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/5_hu_64af7f0891288a69.webp"
width="760"
height="601"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Changes of total volume by planning area.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Figure &lt;a href="#fig-5"&gt;5&lt;/a&gt; suggests that Yishun and Bukit Batok, which are predominantly residential areas, show a smaller year-over-year difference in total trips than Woodlands and Bukit Panjang, i.e. better recovery rates.
Meanwhile, Kallang, Jurong West, Jurong East, Hougang, and Geylang show a similar pattern. We assume that the demographic structure of an area may have some influence on the degree of traffic volume recovery.&lt;/p&gt;
&lt;h4 id="recovery-performance"&gt;Recovery performance&lt;/h4&gt;
&lt;h5 id="recovery-performance-analysis-by-bus-stop-land-use-and-town"&gt;Recovery performance analysis by bus stop, land use and town&lt;/h5&gt;
&lt;p&gt;The recovery performance of a bus stop is defined as the ratio of its total passenger volume in January 2021 to its total passenger volume in January 2020. First, we calculated the recovery rate of each bus stop and examined the overall distribution with a density plot.&lt;/p&gt;
&lt;figure id="figure-fig-6"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Frequency distribution of different recovery rates of bus stops in Singapore." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/6_hu_8fa765158e371f2b.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/6_hu_8d3ec2dc0f810034.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/6_hu_241f4304719ba99f.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/6_hu_8fa765158e371f2b.webp"
width="760"
height="349"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Frequency distribution of different recovery rates of bus stops in Singapore.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Figure &lt;a href="#fig-6"&gt;6&lt;/a&gt; indicates that the recovery rate of 92% of bus stops is below one, meaning that by Jan 2021 their total tap-in and tap-out volume had not recovered to the levels of Jan 2020, before the pandemic. The recovery rates of most bus stops are concentrated between 0.625 and 0.875, while some bus stops experienced a more severe loss in passenger volume, with recovery rates below 0.5. By apportioning the total trips of each bus stop to land uses and planning areas according to their share of land area within a 50 m buffer, we can calculate the recovery rates of different land uses and planning areas.&lt;/p&gt;
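As a minimal sketch of this computation, the per-stop recovery rate and its apportionment to land uses could look like the following. All volumes and buffer shares here are made up for illustration, not the actual LTA/URA figures:

```python
import pandas as pd

# Hypothetical tap-in/tap-out totals per bus stop (illustrative values,
# not the actual LTA DataMall figures used in the analysis).
volumes = pd.DataFrame({
    "stop_id": ["B01", "B02", "B03"],
    "total_jan2020": [12000, 8000, 5000],
    "total_jan2021": [9000, 7200, 5500],
})

# Recovery rate = Jan 2021 volume / Jan 2020 volume, as defined above.
volumes["recovery_rate"] = volumes["total_jan2021"] / volumes["total_jan2020"]

# Share of stops that have not recovered to pre-pandemic levels.
share_below_one = volumes["recovery_rate"].lt(1).mean()

# Apportion each stop's trips to land uses by their (made-up) share of the
# land area within a 50 m buffer, then aggregate per land use.
shares = pd.DataFrame({
    "stop_id": ["B01", "B01", "B02"],
    "land_use": ["Residential", "Commercial", "Residential"],
    "share": [0.7, 0.3, 1.0],
})
merged = shares.merge(volumes, on="stop_id")
merged["trips_2020"] = merged["share"] * merged["total_jan2020"]
merged["trips_2021"] = merged["share"] * merged["total_jan2021"]
by_land_use = merged.groupby("land_use")[["trips_2020", "trips_2021"]].sum()
by_land_use["recovery_rate"] = by_land_use["trips_2021"] / by_land_use["trips_2020"]
```

The same apportionment applies unchanged when the grouping key is the planning area instead of the land use.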
&lt;figure id="figure-fig-7"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Recovery rate of bus riding behaviour by land use." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/7_hu_ee090cabe2914970.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/7_hu_8d37ae8e59b35b21.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/7_hu_54c23a119ac751bf.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/7_hu_ee090cabe2914970.webp"
width="760"
height="473"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Recovery rate of bus riding behaviour by land use.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;In Figure &lt;a href="#fig-7"&gt;7&lt;/a&gt;, the recovery rates of all land uses except Beach Area are below 0.625. Residential and industrial land uses tend to show better recovery performance, while some institutional or tourism-related land uses, such as Airport and Hotel, unsurprisingly still have relatively low recovery rates. The recovery rates of educational institutions, commercial areas, and business parks are not as high as expected, which may be explained by the work-from-home guidelines: these institutions and their staff have largely adapted to remote working or learning, and tend to remain cautious even as the pandemic eases.&lt;/p&gt;
&lt;figure id="figure-fig-8"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Recovery rate of bus riding behaviour by planning area." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/8_hu_2ac3ad5a93cf94de.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/8_hu_5639be1db0579d5f.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/8_hu_96b22016a98d1ee9.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/8_hu_2ac3ad5a93cf94de.webp"
width="760"
height="475"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Recovery rate of bus riding behaviour by planning area.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;figure id="figure-fig-9"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Recovery rate of bus riding behaviour by planning area. Planning areas without bus stops are excluded from the spatial dataset." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/9_hu_60bc0e77d98f251d.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/9_hu_35e8f04b8464e968.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/9_hu_c064408a665ccbc8.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/9_hu_60bc0e77d98f251d.webp"
width="760"
height="461"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Recovery rate of bus riding behaviour by planning area. Planning areas without bus stops are excluded from the spatial dataset.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Moving to planning areas, Figure &lt;a href="#fig-8"&gt;8&lt;/a&gt; and Figure &lt;a href="#fig-9"&gt;9&lt;/a&gt; indicate stark differences in the recovery rates of planning areas, ranging from 0.283 to 0.939. Hit by the drop in tourist numbers, Southern Islands (i.e. the planning area in which Sentosa is located) has the lowest recovery rate, while new towns with a large share of residential land use, such as Punggol, Paya Lebar, and Yishun, recovered faster than other towns.&lt;/p&gt;
&lt;p&gt;Besides land use, we analysed whether socio-demographic features may explain the variance in recovery rates across towns.
We joined data on age (share of seniors), gender ratio (share of female population), population density, and the percentage of residents living in private housing, and computed correlations.&lt;/p&gt;
&lt;figure id="figure-fig-10"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Recovery rate, senior population share, population density, private housing rate, and gender ratio." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/10_hu_6496344fe00f5fce.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/10_hu_6b4c9a255a9b04f4.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/10_hu_196bc0db3763e0fc.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/10_hu_6496344fe00f5fce.webp"
width="760"
height="683"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Recovery rate, senior population share, population density, private housing rate, and gender ratio.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;The results in Figure &lt;a href="#fig-10"&gt;10&lt;/a&gt; suggest that population density has the strongest correlation with recovery rate: the more populated a town is, the faster its bus passenger volume appears to recover from the pandemic. The percentage of residents living in private housing has a moderate negative correlation with recovery rate, possibly because private transport modes are more accessible to residents of planning areas with more private dwellings.&lt;/p&gt;
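A correlation analysis of this kind can be sketched as follows. The table values here are illustrative stand-ins for the actual SingStat and LTA figures, chosen only to mimic the direction of the reported relationships:

```python
import pandas as pd

# Hypothetical planning-area table: recovery rate vs. socio-demographic
# features (illustrative values, not the real data).
areas = pd.DataFrame({
    "recovery_rate": [0.62, 0.71, 0.55, 0.80, 0.47],
    "senior_share": [0.14, 0.12, 0.16, 0.10, 0.18],
    "pop_density": [9000, 11000, 7000, 13000, 5000],
    "private_housing_share": [0.20, 0.10, 0.35, 0.05, 0.50],
    "female_share": [0.51, 0.50, 0.52, 0.50, 0.51],
})

# Pairwise Pearson correlations, of the kind visualised in Figure 10.
corr = areas.corr()
print(corr["recovery_rate"].round(2))
```

With a real table of only a few dozen planning areas, such coefficients should be read cautiously, as noted in the limitations below.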
&lt;h5 id="recovery-performance-evaluation-and-adjustment-recommendation-of-bus-stops"&gt;Recovery performance evaluation and adjustment recommendation of bus stops&lt;/h5&gt;
&lt;p&gt;Combining the recovery rates of planning areas, land uses, and individual bus stops, as well as their mutual passenger flows, we are able to conduct a bus stop recovery performance evaluation.
Each bus stop was given a categorical score (from A to E) to indicate its overall recovery rank based on the aforementioned aspects.
We visualized the bus stops and their ranks on a map and examined the spatial relationships.&lt;/p&gt;
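The exact scoring formula is not specified in the post; as one hypothetical illustration, the categorical A to E ranks could be assigned by quantile of a composite recovery score:

```python
import pandas as pd

# Illustrative only: "score" stands in for a composite of the recovery
# aspects described above; the actual weighting is not specified here.
stops = pd.DataFrame({
    "stop_id": ["B01", "B02", "B03", "B04", "B05"],
    "score": [0.91, 0.75, 0.62, 0.48, 0.30],
})

# Quintile-based letter ranks: lowest-scoring fifth gets E, highest gets A.
stops["rank"] = pd.qcut(stops["score"], 5, labels=["E", "D", "C", "B", "A"])
```

Quantile binning keeps the rank classes equally sized, which suits a relative comparison across stops rather than an absolute threshold.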
&lt;figure id="figure-fig-13"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Recovery rank of bus stops for a part of Singapore." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/13_hu_d78f8c69894efd2a.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/13_hu_7988a20bf038fb91.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/13_hu_f6a0fc934ada6b09.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/13_hu_d78f8c69894efd2a.webp"
width="677"
height="484"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Recovery rank of bus stops for a part of Singapore.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;figure id="figure-fig-14"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Recovery rank of bus stops (only D and E)." srcset="
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/14_hu_328a7a720fd61aea.webp 400w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/14_hu_2706207e8e906181.webp 760w,
/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/14_hu_278a76288951299b.webp 1200w"
src="https://ual.sg/post/2021/05/31/impact-of-the-pandemic-on-bus-ridership-in-singapore/14_hu_328a7a720fd61aea.webp"
width="760"
height="485"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption data-pre="Figure&amp;nbsp;" data-post=":&amp;nbsp;" class="numbered"&gt;
Recovery rank of bus stops (only D and E).
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Figures &lt;a href="#fig-13"&gt;11&lt;/a&gt; &amp;amp; &lt;a href="#fig-14"&gt;12&lt;/a&gt; show that bus stops of rank A or B are fairly evenly distributed, while bus stops of rank C or D tend to concentrate in areas such as the CBD, Bukit Timah, and Tuas. Areas with a notable share of rank A and B bus stops could well preserve and maintain their existing bus stops, while areas with a high concentration of rank D and E bus stops could consider consolidating some stops or adjusting their locations until the impacts of the pandemic subside.&lt;/p&gt;
&lt;h3 id="conclusions"&gt;Conclusions&lt;/h3&gt;
&lt;p&gt;In this exercise, we observed how the pandemic has influenced public transit behaviour in terms of both robustness and resilience. As almost everywhere around the globe, measures introduced to combat the pandemic greatly influenced people’s transit travel behaviour, as shown by the rates of change across planning areas and land uses. In terms of robustness, we found that certain land uses and planning areas experienced the most severe drops in passenger volume during May 2020 and July 2020, when the pandemic and the restrictions reached their peaks. The locations people tend to travel to by bus have also changed, as shown by the shifting ranking of destination land use types.&lt;/p&gt;
&lt;p&gt;In terms of resilience, we found that total bus trips had not yet returned to pre-pandemic levels in any planning area or land use, despite differences in their recovery rates. Therefore, as an urban planning exercise, we conducted a recovery performance evaluation of bus stops.&lt;/p&gt;
&lt;h3 id="limitations"&gt;Limitations&lt;/h3&gt;
&lt;p&gt;This analysis is not without limitations.
The most important ones are:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;We assumed that all changes in bus trips were effects of the pandemic, and did not fully consider other factors, such as the bus operator’s own adjustments to stops and routes, which may be unrelated to the pandemic and its measures.&lt;/li&gt;
&lt;li&gt;We gave the same weight to all site types: the passenger volume of each stop was apportioned by the footprint share of each land use within the service radius, without accounting for the higher likelihood of land uses such as commercial and residential being actual travel destinations, so the results may deviate somewhat from reality.&lt;/li&gt;
&lt;li&gt;In the correlation analysis of the factors influencing recovery rates across regions, the amount of usable data was limited and the sample size relatively small, so the correlation coefficients may not be entirely indicative.&lt;/li&gt;
&lt;/ol&gt;
&lt;h3 id="acknowledgements"&gt;Acknowledgements&lt;/h3&gt;
&lt;p&gt;This student project was carried out as part of module &lt;a href="https://www.sde.nus.edu.sg/arch/programme_article/dep5111-planning-technologies/" target="_blank" rel="noopener"&gt;DEP5111 Planning Technologies&lt;/a&gt; in the &lt;a href="https://www.sde.nus.edu.sg/arch/programmes/master-of-urban-planning/curriculum/" target="_blank" rel="noopener"&gt;Master of Urban Planning&lt;/a&gt;, which is conducted by the &lt;a href="https://ual.sg/"&gt;NUS Urban Analytics Lab&lt;/a&gt;.
The open data sources by LTA, URA and SingStat used in this analysis are gratefully acknowledged.&lt;/p&gt;
&lt;h3 id="further-reading"&gt;Further reading&lt;/h3&gt;
&lt;p&gt;If you find this article of interest, have a look also at &lt;a href="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/"&gt;our April 2020 analysis using carpark data&lt;/a&gt;.&lt;/p&gt;
&lt;div class="footnotes" role="doc-endnotes"&gt;
&lt;hr&gt;
&lt;ol&gt;
&lt;li id="fn:1"&gt;
&lt;p&gt;Tan, C. (10 February 2021). &lt;a href="https://www.straitstimes.com/singapore/transport/bus-train-ridership-in-singapore-falls-to-11-year-low-amid-covid-19-pandemic" target="_blank" rel="noopener"&gt;Bus, train ridership in Singapore falls to 11-year low amid Covid-19 pandemic&lt;/a&gt;. The Straits Times.&amp;#160;&lt;a href="#fnref:1" class="footnote-backref" role="doc-backlink"&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id="fn:2"&gt;
&lt;p&gt;WMATA. &lt;a href="https://www.wmata.com/service/status/details/COVID-19.cfm" target="_blank" rel="noopener"&gt;Metro and Covid-19: Steps we’ve taken&lt;/a&gt;. 2020 [accessed on 2 April 2020].&amp;#160;&lt;a href="#fnref:2" class="footnote-backref" role="doc-backlink"&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id="fn:3"&gt;
&lt;p&gt;Harrabin, B.R. (25 April 2020). &lt;a href="https://www.bbc.com/news/business-52414376" target="_blank" rel="noopener"&gt;Coronavirus: Transport usage will change after lockdown&lt;/a&gt;. BBC News.&amp;#160;&lt;a href="#fnref:3" class="footnote-backref" role="doc-backlink"&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;</description></item><item><title>NUS Urban Analytics Lab scales research globally with AWS</title><link>https://ual.sg/post/2021/05/23/nus-urban-analytics-lab-scales-research-globally-with-aws/</link><pubDate>Sun, 23 May 2021 06:43:28 +0800</pubDate><guid>https://ual.sg/post/2021/05/23/nus-urban-analytics-lab-scales-research-globally-with-aws/</guid><description>&lt;p&gt;Our ongoing research on developing a toolkit and dataset of urban morphology was featured as a &lt;a href="https://aws.amazon.com/blogs/publicsector/nus-urban-analytics-lab-scales-research-globally-aws/" target="_blank" rel="noopener"&gt;news item&lt;/a&gt; by Amazon at their &lt;a href="https://aws.amazon.com/blogs/publicsector/" target="_blank" rel="noopener"&gt;AWS Public Sector Blog&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We thank AWS for their support and for writing about our project.&lt;/p&gt;
&lt;p&gt;The &lt;a href="https://aws.amazon.com/blogs/publicsector/nus-urban-analytics-lab-scales-research-globally-aws/" target="_blank" rel="noopener"&gt;full release&lt;/a&gt; is copied below.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;The Urban Analytics Lab at the National University of Singapore (NUS) spearheads research in geospatial data analysis and three-dimensional (3D) city modelling. The lab’s work underpins the development of smart cities and provides scientists, architects, urban planners, and real estate developers with data insights. These insights help parties make informed decisions about projects ranging from energy modelling to urban farming. To meet rising global demand for its data analytics and planning tools, Urban Analytics Lab turned to Amazon Web Services (AWS).&lt;/p&gt;
&lt;p&gt;Following its inception in 2019, Urban Analytics Lab used a single server on the NUS campus to power its research. However, the limited capabilities of the infrastructure meant that researchers had to limit the scope of their projects or run them on a smaller scale. The research group selected AWS not only because it was on the NUS IT department’s pre-approved cloud vendor list, but because of its track record supporting academic research and providing generous cloud credits to help institutions spark innovation.&lt;/p&gt;
&lt;p&gt;AWS offered Urban Analytics Lab the global, on-demand scalability and availability it needed to expand its research. Researchers could adopt AWS services quickly to help them conduct research more efficiently.&lt;/p&gt;
&lt;p&gt;Today, Urban Analytics Lab uses AWS as the foundation for most of its investigations. Much of the lab’s research is based on OpenStreetMap, a crowdsourced geospatial dataset. Using Amazon Relational Database Service (Amazon RDS), researchers imported OpenStreetMap data for the entire world into a PostGIS database that they have upgraded with custom procedures for processing and analysis.&lt;/p&gt;
&lt;p&gt;Hosting the PostGIS database on Amazon RDS enables researchers to work on an unprecedented scale. They can conduct more thorough and in-depth analysis and apply it to almost any geographic location.&lt;/p&gt;
&lt;p&gt;Urban Analytics Lab also used Amazon RDS and Amazon Elastic Compute Cloud (Amazon EC2) to create a 10 TB geospatial warehouse from which it derives quantitative properties of buildings from urban areas all around the world, including densities, volumetric compactness, and complexity of urban landscapes. The resulting Global Building Morphological Indicator dataset, spanning 500 million buildings and counting, will provide the most consistent and comprehensive set of metrics yet for buildings and the built environment around the world. Such a dataset may be used by urban planners and energy and climate researchers to understand the urban fabric and interplay with phenomena such as emissions, population, and economic data.&lt;/p&gt;
&lt;p&gt;Scientists at the lab plan to apply machine learning (ML) to these indicators to predict the heights of individual buildings around the world, enhancing the input OpenStreetMap dataset, which is still largely confined to two-dimensional (2D) data. Adding height information will support further analyses and use cases, such as noise pollution estimations and wind flow analyses. Amazon EC2 and Amazon SageMaker will be used to build, train, and manage the models while the experiments are conducted, and Amazon Simple Storage Service (Amazon S3) will be used to store and share the intermediate, secondary, and final results.&lt;/p&gt;
&lt;p&gt;The scalability the research group achieved by using AWS means that Urban Analytics Lab no longer needs to invest in its own high-performance and high-capacity servers and can redeploy those funds to other areas of the organization. And, since AWS offers secure access free of internal VPN restrictions, research teams can work virtually anywhere—a key advantage particularly during the COVID-19 pandemic.&lt;/p&gt;
&lt;p&gt;With AWS, researchers can perform analysis and generate urban morphological data on a global scale, benefitting both its own research efforts and relevant studies in peripheral fields.&lt;/p&gt;
&lt;p&gt;Learn more about the cloud for research and the cloud for higher education.&lt;/p&gt;</description></item><item><title>Novel use of 3D geoinformation to identify urban farming sites</title><link>https://ual.sg/post/2021/04/18/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/</link><pubDate>Sun, 18 Apr 2021 18:43:28 +0800</pubDate><guid>https://ual.sg/post/2021/04/18/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/</guid><description>&lt;p&gt;&lt;a href="https://ual.sg/post/2021/01/11/new-paper-3d-city-models-for-urban-farming-site-identification-in-buildings/"&gt;Our research on developing a new application of 3D city models for urban farming estimations&lt;/a&gt; was featured as a &lt;a href="https://news.nus.edu.sg/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/" target="_blank" rel="noopener"&gt;news item&lt;/a&gt; by our university.&lt;/p&gt;
&lt;p&gt;Thanks to NUS for sharing our work! &amp;#x1f64f;&lt;/p&gt;
&lt;p&gt;The full release is copied below.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;Urban farming has picked up in scale and sophistication globally in recent years. Several innovative urban farming approaches have been introduced in Singapore, such as rooftop farming, optimisation of land use, the introduction of more greenery into the built environment, and the 30 by 30 vision set by the Singapore Food Agency to target the production of 30 per cent of Singapore’s nutritional needs by 2030.&lt;/p&gt;
&lt;p&gt;However, the suitability of specific crops and locations for farming is variable, and conventional methods to assess farming potential involve field visits and time-consuming measurements.&lt;/p&gt;
&lt;p&gt;How can we assess the suitability of farming locations in land scarce Singapore quickly and accurately?&lt;/p&gt;
&lt;figure id="figure-spaces-such-as-rooftops-and-stairways-in-high-rise-buildings-may-unlock-new-sites-for-farming-in-land-scarce-singapore"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Spaces such as rooftops and stairways in high-rise buildings may unlock new sites for farming in land-scarce Singapore." srcset="
/post/2021/04/18/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/1920_20210416collage_hu_cc56ea0f73cd640c.webp 400w,
/post/2021/04/18/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/1920_20210416collage_hu_a159bc48fd40d6.webp 760w,
/post/2021/04/18/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/1920_20210416collage_hu_4ddebc9fc6f1631c.webp 1200w"
src="https://ual.sg/post/2021/04/18/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/1920_20210416collage_hu_cc56ea0f73cd640c.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Spaces such as rooftops and stairways in high-rise buildings may unlock new sites for farming in land-scarce Singapore.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h4 id="predicting-sunlight-conditions-to-determine-farming-sites-and-suitable-crops"&gt;Predicting sunlight conditions to determine farming sites and suitable crops&lt;/h4&gt;
&lt;p&gt;Led by Dr &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, presidential young professor at &lt;a href="https://www.sde.nus.edu.sg/" target="_blank" rel="noopener"&gt;NUS Design and Environment&lt;/a&gt;, the study investigates the possibility of using three-dimensional (3D) city models and urban digital twins to assess the suitability of farming locations in high-rise buildings in terms of sunlight availability.&lt;/p&gt;
&lt;p&gt;Titled “&lt;a href="https://www.sciencedirect.com/science/article/pii/S0198971520303173" target="_blank" rel="noopener"&gt;3D city models for urban farming site identification in buildings&lt;/a&gt;”, their research paper was published in the journal Computers, Environment and Urban Systems, based on a proof of concept focused on a residential building situated at Jurong West in Singapore. Field surveys were carried out to validate the simulation figures.&lt;/p&gt;
&lt;p&gt;“We investigate whether vertical spaces of buildings comprising outdoor corridors, façades and windows receive sufficient photosynthetically active radiation (PAR) for growing food crops and do so at a high resolution, obtaining insights for hundreds of locations in a particular building,” shared the paper’s first author Mr Ankit Palliwal, who graduated from the &lt;a href="https://fass.nus.edu.sg/geog/" target="_blank" rel="noopener"&gt;NUS Geography&lt;/a&gt; with a Master of Science in Applied GIS.&lt;/p&gt;
&lt;p&gt;PAR is defined as the portion of the solar spectrum in the 400 to 700 nm wavelength range that is utilised by plants for photosynthesis. Its amount is a key factor in understanding whether a location has farming potential and which crops can be grown at a specific site, because different crops require different PAR conditions for their optimal growth.&lt;/p&gt;
&lt;p&gt;“We conducted field measurements to verify the veracity of the simulations and concluded that 3D city models are a viable instrument for calculating the potential of spaces in buildings for urban farming, potentially replacing field surveys and doing so more efficiently. We were able to understand the farming conditions for each locality in a specific building without visiting it, and to decide which crops are suitable to be grown. For this particular building, we have identified locations that would be suitable for growing lettuce and sweet pepper. This research is the first instance in which 3D geoinformation has been used for this purpose, thus, we invented a new application of such data, which is becoming increasingly important in the context of smart cities,” shared Dr Biljecki, the principal investigator of the study.&lt;/p&gt;
&lt;figure id="figure-one-of-the-results-of-the-simulations-providing-insight-for-decision-making-for-high-rise-urban-farming-and-for-maximising-the-crop-yield"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="One of the results of the simulations, providing insight for decision-making for high-rise urban farming and for maximising the crop yield." srcset="
/post/2021/04/18/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/1920_20210416gis3dmodel_hu_4ae68f66e5a34986.webp 400w,
/post/2021/04/18/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/1920_20210416gis3dmodel_hu_264cb10c8f61740d.webp 760w,
/post/2021/04/18/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/1920_20210416gis3dmodel_hu_f1bad9badd476330.webp 1200w"
src="https://ual.sg/post/2021/04/18/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/1920_20210416gis3dmodel_hu_4ae68f66e5a34986.webp"
width="760"
height="427"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
One of the results of the simulations, providing insight for decision-making for high-rise urban farming and for maximising the crop yield.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h4 id="potential-to-upscale-and-cover-entire-cities"&gt;Potential to upscale and cover entire cities&lt;/h4&gt;
&lt;p&gt;3D spatial data has an unparalleled advantage over doing field measurements when there are many locations to evaluate and especially when scaling up the estimations at the precinct or urban scale.&lt;/p&gt;
&lt;p&gt;Such analyses can be conducted using 3D models obtained from open data, and the simulations can be run using open-source software, facilitating replication elsewhere and scaling to entire cities, uncovering their urban farming potential and aiding the design of planning strategies.&lt;/p&gt;
&lt;p&gt;“Another possible work direction would be to investigate if 3D city models can be combined with existing approaches to assess the suitability for installing solar panels and energy simulations, recommending the optimal mix and arrangement of photovoltaic installations and agricultural crops in the same building, presenting a holistic solution for supporting green buildings and sustainable development. Given the dynamic features of urban farming, and accompanying aspects such as real estate and microclimate factors, this work brings a new dimension to digital twin developments,” concluded Dr Biljecki.&lt;/p&gt;
&lt;p&gt;The study is a research project under the &lt;a href="https://ual.sg/" target="_blank" rel="noopener"&gt;Urban Analytics Lab&lt;/a&gt;, a multidisciplinary research group formed in 2019 which focuses on urban data science, 3D GIS, and digital twins. The research team for this study is led by Dr Filip Biljecki and comprises Mr Ankit Palliwal, as well as PhD candidate Ms Song Shuang and Assoc Prof Hugh Tan Tiang Wah from the &lt;a href="https://www.dbs.nus.edu.sg/" target="_blank" rel="noopener"&gt;NUS Biological Sciences&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Recent activities of the UAL: guest lectures and lab seminars</title><link>https://ual.sg/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/</link><pubDate>Sat, 03 Apr 2021 09:00:28 +0800</pubDate><guid>https://ual.sg/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/</guid><description>&lt;p&gt;We were quite active in the past months.
Besides our work being &lt;a href="https://ual.sg/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/"&gt;featured at a high profile event&lt;/a&gt;, we had a bunch of internal activities: we were fortunate to be able to host three guest speakers as part of our master of urban planning module on urban data science, and we had also organised a few lab seminars.&lt;/p&gt;
&lt;h3 id="guest-lectures"&gt;Guest lectures&lt;/h3&gt;
&lt;p&gt;The first guest lecture was given by Zhongwen Huang, an urban planner and Director of the Urban Redevelopment Authority (URA) Digital Planning Lab, on the topic of Urban Planning 2.0.
Very impressive work.&lt;/p&gt;
&lt;figure id="figure-director-zhongwen-huang-presenting-the-work-of-the-ura-digital-planning-lab"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Director Zhongwen Huang presenting the work of the URA Digital Planning Lab." srcset="
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/zhongwen-huang-guest-lecture_hu_b4c4766a103728fc.webp 400w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/zhongwen-huang-guest-lecture_hu_f6ea74980cc8ccfe.webp 760w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/zhongwen-huang-guest-lecture_hu_aa99a2c0c9956d24.webp 1200w"
src="https://ual.sg/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/zhongwen-huang-guest-lecture_hu_b4c4766a103728fc.webp"
width="760"
height="377"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Director Zhongwen Huang presenting the work of the URA Digital Planning Lab.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Second, our &lt;a href="https://ual.sg/author/abraham-noah-wu/"&gt;Abraham Noah Wu&lt;/a&gt; presented &lt;a href="https://ual.sg/project/roofpedia/"&gt;Roofpedia&lt;/a&gt;, his ongoing project on mapping solar and green roofs around the world.
Abraham has developed an approach to automatically identify, across 17 cities, rooftops that contribute towards sustainable development.&lt;/p&gt;
&lt;figure id="figure-our-lab-member-abraham-presenting-the-methodology-and-results-of-roofpedia"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Our lab member Abraham presenting the methodology and results of Roofpedia." srcset="
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/abraham-guest-lecture_hu_1979103e4f709c23.webp 400w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/abraham-guest-lecture_hu_4d548535bb3a6eb.webp 760w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/abraham-guest-lecture_hu_c93d966848319ba9.webp 1200w"
src="https://ual.sg/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/abraham-guest-lecture_hu_1979103e4f709c23.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Our lab member Abraham presenting the methodology and results of Roofpedia.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;The third guest lecture was given by &lt;a href="https://twitter.com/dshkol" target="_blank" rel="noopener"&gt;Dmitry Shkolnik&lt;/a&gt;, a data scientist/economist at Grab, on the topic of &lt;em&gt;Carving out a niche with (and for) R at Grab&lt;/em&gt;.
Dmitry also presented his personal projects, e.g. R packages he has developed.
Make sure to check out his &lt;a href="https://www.dshkol.com" target="_blank" rel="noopener"&gt;blog&lt;/a&gt;, which contains very cool analyses.&lt;/p&gt;
&lt;figure id="figure-carving-out-a-niche-with-and-for-r-at-grab-by-dmitry-shkolnik"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Carving out a niche with (and for) R at Grab, by Dmitry Shkolnik." srcset="
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/dmitry-guest-lecture_hu_4b18761ca13d0309.webp 400w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/dmitry-guest-lecture_hu_edb4547a06e3d6b2.webp 760w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/dmitry-guest-lecture_hu_d767de3000cfa0b8.webp 1200w"
src="https://ual.sg/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/dmitry-guest-lecture_hu_4b18761ca13d0309.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Carving out a niche with (and for) R at Grab, by Dmitry Shkolnik.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="lab-seminars"&gt;Lab seminars&lt;/h3&gt;
&lt;p&gt;We are happy that, amid the pandemic, we managed to organise an in-person lab seminar hosting an external speaker.
&lt;a href="https://people.utwente.nl/c.wu" target="_blank" rel="noopener"&gt;Cai Wu&lt;/a&gt;, a doctoral candidate at the University of Twente in the Netherlands, presented his work &lt;em&gt;Simulating The Urban Spatial Structure With Spatial Interaction&lt;/em&gt;.
His work also includes Singapore as a study area.
Cai, thanks for visiting us; we wish you success on your PhD journey.&lt;/p&gt;
&lt;figure id="figure-cai-wu-university-of-twente-presenting-his-research-at-our-lab-seminar"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Cai Wu (University of Twente) presenting his research at our lab seminar." srcset="
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/cai-wu-seminar_hu_b46b57dec7021946.webp 400w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/cai-wu-seminar_hu_3fb4a59855e2b850.webp 760w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/cai-wu-seminar_hu_6f3f7b94624327e9.webp 1200w"
src="https://ual.sg/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/cai-wu-seminar_hu_b46b57dec7021946.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Cai Wu (University of Twente) presenting his research at our lab seminar.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Finally, our graduate students &lt;a href="https://ual.sg/author/koichi-ito/"&gt;Koichi Ito&lt;/a&gt; and &lt;a href="https://ual.sg/author/wangyang-chen/"&gt;Wangyang Chen&lt;/a&gt; updated us on their progress and plans for further research.
Both are working on master&amp;rsquo;s theses that advance the applications of deep learning in urban planning.
Wangyang is working on developing a deep learning-based method to classify urban morphology and relate it to urban vitality, while Koichi is investigating the usability of street view imagery for assessing bikeability (with Singapore and Tokyo as case studies).&lt;/p&gt;
&lt;figure id="figure-wangyang-presenting-his-research-at-our-lab-seminar"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Wangyang presenting his research at our lab seminar." srcset="
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/wangyang-seminar_hu_e6a4b1d82ad21601.webp 400w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/wangyang-seminar_hu_32be9752ed26eb7c.webp 760w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/wangyang-seminar_hu_22bda8d41f32df45.webp 1200w"
src="https://ual.sg/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/wangyang-seminar_hu_e6a4b1d82ad21601.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Wangyang presenting his research at our lab seminar.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;figure id="figure-koichi-presenting-his-research-at-our-lab-seminar"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Koichi presenting his research at our lab seminar." srcset="
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/koichi-seminar_hu_a9c7a69adfc3a939.webp 400w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/koichi-seminar_hu_e040bc00c69ad569.webp 760w,
/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/koichi-seminar_hu_8ea2d4a75a860f55.webp 1200w"
src="https://ual.sg/post/2021/04/03/recent-activities-of-the-ual-guest-lectures-and-lab-seminars/koichi-seminar_hu_a9c7a69adfc3a939.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Koichi presenting his research at our lab seminar.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Thanks to everyone for sharing their cool work! &amp;#x1f64f;&lt;/p&gt;</description></item><item><title>Dr Filip Biljecki gave an invited talk at Geo Connect Asia 2021</title><link>https://ual.sg/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/</link><pubDate>Sun, 28 Mar 2021 19:15:16 +0800</pubDate><guid>https://ual.sg/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/</guid><description>&lt;p&gt;The Director of the NUS Urban Analytics Lab, Dr &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, gave an invited talk at the debut of &lt;a href="https://www.geoconnectasia.com" target="_blank" rel="noopener"&gt;Geo Connect Asia&lt;/a&gt;.
The conference, conducted in hybrid mode at the Marina Bay Sands in Singapore, provided a strategic and collaborative platform for industry professionals, government agencies and start-ups to come together to build and develop knowledge and solutions for Asia’s geospatial &amp;amp; location intelligence markets.&lt;/p&gt;
&lt;p&gt;This was a high profile event, bringing together industry and government leaders in the geospatial domain from around the world.
The conference started with a talk by &lt;a href="https://en.wikipedia.org/wiki/Vivian_Balakrishnan" target="_blank" rel="noopener"&gt;Dr Vivian Balakrishnan&lt;/a&gt;, Singapore&amp;rsquo;s Minister for Foreign Affairs and Minister-in-Charge of the Smart Nation Initiative.&lt;/p&gt;
&lt;p&gt;UAL&amp;rsquo;s Dr Biljecki was the only speaker from academia, affirming our relevance and the hard work of &lt;a href="https://ual.sg/people"&gt;everyone at the Lab&lt;/a&gt;.
The talk was on assessing and benchmarking open geospatial data, giving the audience a sneak peek into our ongoing work.&lt;/p&gt;
&lt;p&gt;GCA 2021 was Singapore&amp;rsquo;s first large-scale industrial event in a long time, attracting 1000+ attendees in person and many more online across 55 countries, extending well beyond Singapore and Southeast Asia.
It was also well publicised in the media &lt;sup id="fnref:1"&gt;&lt;a href="#fn:1" class="footnote-ref" role="doc-noteref"&gt;1&lt;/a&gt;&lt;/sup&gt; &lt;sup id="fnref:2"&gt;&lt;a href="#fn:2" class="footnote-ref" role="doc-noteref"&gt;2&lt;/a&gt;&lt;/sup&gt;.&lt;/p&gt;
&lt;p&gt;It was an immense privilege and honour to have our developments featured at the inaugural event of Geo Connect Asia and in front of such an esteemed audience.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/IMG_1643_hu_a7d50867ea552f99.webp 400w,
/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/IMG_1643_hu_1c38ddaf4a508dc3.webp 760w,
/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/IMG_1643_hu_cd54476fd687e9bf.webp 1200w"
src="https://ual.sg/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/IMG_1643_hu_a7d50867ea552f99.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Thanks to the sponsors and organisers for the invitation.
Congratulations on a great job amid the circumstances, especially on facilitating the smooth hybrid mode including attendees from so many countries, and seamlessly implementing enhancements such as contact tracing tokens, testing, and facial recognition.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/IMG_1596_hu_257b2f8f67d62cae.webp 400w,
/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/IMG_1596_hu_50df32f6ca6361ea.webp 760w,
/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/IMG_1596_hu_3c7326491c7baee5.webp 1200w"
src="https://ual.sg/post/2021/03/28/dr-filip-biljecki-gave-an-invited-talk-at-geo-connect-asia-2021/IMG_1596_hu_257b2f8f67d62cae.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;div class="footnotes" role="doc-endnotes"&gt;
&lt;hr&gt;
&lt;ol&gt;
&lt;li id="fn:1"&gt;
&lt;p&gt;&lt;a href="https://www.straitstimes.com/singapore/transport/movement-tracking-wristbands-to-be-trialled-at-hybrid-mice-event" target="_blank" rel="noopener"&gt;https://www.straitstimes.com/singapore/transport/movement-tracking-wristbands-to-be-trialled-at-hybrid-mice-event&lt;/a&gt;&amp;#160;&lt;a href="#fnref:1" class="footnote-backref" role="doc-backlink"&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id="fn:2"&gt;
&lt;p&gt;&lt;a href="https://www.channelnewsasia.com/news/singapore/covid-19-distancing-dongles-meetings-wef-geo-connect-asia-14480724" target="_blank" rel="noopener"&gt;https://www.channelnewsasia.com/news/singapore/covid-19-distancing-dongles-meetings-wef-geo-connect-asia-14480724&lt;/a&gt;&amp;#160;&lt;a href="#fnref:2" class="footnote-backref" role="doc-backlink"&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;</description></item><item><title>Our department is among the top 10 globally and best in Asia</title><link>https://ual.sg/post/2021/03/04/our-department-is-among-the-top-10-globally-and-best-in-asia/</link><pubDate>Thu, 04 Mar 2021 19:30:16 +0800</pubDate><guid>https://ual.sg/post/2021/03/04/our-department-is-among-the-top-10-globally-and-best-in-asia/</guid><description>&lt;p&gt;It is great to learn that our department, &lt;a href="https://www.sde.nus.edu.sg/arch/" target="_blank" rel="noopener"&gt;NUS Architecture&lt;/a&gt;, is now among the &lt;a href="https://www.sde.nus.edu.sg/arch/news_and_events/news_ay2021_facultyachievements_nus-architecture-ranks-6th-in-the-world-by-2021-qs-world-university-rankings-by-subject/" target="_blank" rel="noopener"&gt;top 10 in the world and #1 in Asia&lt;/a&gt; according to the &lt;a href="https://www.topuniversities.com/subject-rankings/2021" target="_blank" rel="noopener"&gt;Quacquarelli Symonds (QS) World University Rankings by subject 2021&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Congratulations to all our colleagues and other highly ranked departments. &amp;#x1f44f;&lt;/p&gt;</description></item><item><title>Publication of the collection Emerging topics in 3D GIS</title><link>https://ual.sg/post/2021/02/17/publication-of-the-collection-emerging-topics-in-3d-gis/</link><pubDate>Wed, 17 Feb 2021 10:30:16 +0800</pubDate><guid>https://ual.sg/post/2021/02/17/publication-of-the-collection-emerging-topics-in-3d-gis/</guid><description>&lt;p&gt;The section of the &lt;a href="https://onlinelibrary.wiley.com/toc/14679671/2021/25/1" target="_blank" rel="noopener"&gt;latest issue of Transactions in GIS&lt;/a&gt;, titled &lt;em&gt;Emerging topics in 3D GIS&lt;/em&gt;, represents a collection of approaches to acquire, analyse, and utilise 3D geospatial and Building Information Modelling (BIM) data.&lt;/p&gt;
&lt;p&gt;The special issue has been edited by &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, &lt;a href="https://www.sde.nus.edu.sg/arch/staffs/rudi-stouffs-dr/" target="_blank" rel="noopener"&gt;Rudi Stouffs&lt;/a&gt;, and &lt;a href="https://findanexpert.unimelb.edu.au/profile/99751-mohsen-kalantari-soltanieh" target="_blank" rel="noopener"&gt;Mohsen Kalantari&lt;/a&gt;, and it has been published in the February 2021 issue (vol. 25, issue 1) of the journal.&lt;/p&gt;
&lt;p&gt;The issue aims to provide insight into the latest developments and applications in advanced 3D data and technologies, encompassing topics from 3D city model acquisition and processing to BIM and 3D spatial data analysis.
A part of this special issue arises from the &lt;a href="https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/"&gt;3D GeoInfo 2019 Conference and the 2nd BIM/GIS Integration Workshop organised in Singapore&lt;/a&gt;.
The authors &lt;a href="https://ual.sg/post/2019/09/12/special-issue-in-transactions-in-gis-on-3d-city-modelling-and-bim/"&gt;were given an opportunity&lt;/a&gt; to extend their research papers published in &lt;a href="https://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/IV-4-W8/" target="_blank" rel="noopener"&gt;the proceedings of the event&lt;/a&gt;, but this issue was also open to other researchers working on the state of the art of the topics covered by the event.&lt;/p&gt;
&lt;p&gt;This special issue consists of an editorial and eleven research papers (see below) that represent the latest emerging topics in 3D GIS and cover a number of important aspects of 3D spatial data science and BIM.
Six of these papers are open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;.&lt;/p&gt;
&lt;p&gt;The guest editors thank all authors and reviewers for their valuable contributions, which made this special issue possible.
We also want to express our gratitude to the editorial team of Transactions in GIS, especially the Editor-in-Chief John P. Wilson and the Editorial Board member Feng Chen-Chieh.
The participants and sponsors of 3D Singapore, and the Singapore Land Authority and the International Society for Photogrammetry and Remote Sensing, are gratefully acknowledged for their role in the event from which this special issue stems.&lt;/p&gt;
&lt;h2 id="list-of-papers"&gt;List of papers&lt;/h2&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12728" target="_blank" rel="noopener"&gt;Editorial&lt;/a&gt; by Filip Biljecki, Rudi Stouffs and Mohsen Kalantari. (&lt;a href="https://ual.sg/publication/2021-tgis-editorial/"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;)&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12636" target="_blank" rel="noopener"&gt;Area and volume computation of longitude–latitude grids and three‐dimensional meshes&lt;/a&gt;,
by Kevin Kelly and Bojan Šavrič.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12638" target="_blank" rel="noopener"&gt;Web‐based real‐time visualization of large‐scale weather radar data using 3D tiles&lt;/a&gt;
by Mingyue Lu, Xinhao Wang, Xintao Liu, Min Chen, Shuoben Bi, Yadong Zhang and Tengfei Lao.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12658" target="_blank" rel="noopener"&gt;Looking for a needle in a haystack: Probability density based classification and reconstruction of dormers from 3D point clouds&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;
by Youness Dehbi, Sonja Koppers and Lutz Plümer.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12664" target="_blank" rel="noopener"&gt;Room semantics inference using random forest and relational graph convolutional networks: A case study of research building&lt;/a&gt;
by Xuke Hu, Hongchao Fan, Alexey Noskov, Zhiyong Wang, Alexander Zipf, Fuqiang Gu and Jianga Shang.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12659" target="_blank" rel="noopener"&gt;Robust and fast reconstruction of complex roofs with active sampling from 3D point clouds&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;
by Youness Dehbi, André Henn, Gerhard Gröger, Viktor Stroh and Lutz Plümer.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12672" target="_blank" rel="noopener"&gt;Comparison of versioning methods to improve the information flow in the planning and building processes&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;
by Helen Eriksson, Jing Sun, Väino Tarandi and Lars Harrie.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12685" target="_blank" rel="noopener"&gt;Automatic filtering and 2D modeling of airborne laser scanning building point cloud&lt;/a&gt;
by Fayez Tarsha Kurdi, Mohammad Awrangjeb and Nosheen Munir.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12686" target="_blank" rel="noopener"&gt;Consistency grammar for 3D indoor model checking&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;
by Shayan Nikoohemat, Abdoulaye A. Diakité, Ville Lehtola, Sisi Zlatanova and George Vosselman.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12695" target="_blank" rel="noopener"&gt;Advances in techniques to formulate the watertight concept for cadastre&lt;/a&gt;
by Ali Asghari, Mohsen Kalantari and Abbas Rajabifard.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12719" target="_blank" rel="noopener"&gt;Improving trajectory estimation using 3D city models and kinematic point clouds&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;
by Lukas Lucks, Lasse Klingbeil, Lutz Plümer and Youness Dehbi.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://doi.org/10.1111/tgis.12723" target="_blank" rel="noopener"&gt;A modular graph transformation rule set for IFC‐to‐CityGML conversion&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;
by Helga Tauscher, Joie Lim and Rudi Stouffs.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;</description></item><item><title>New paper: 3dfier: automatic reconstruction of 3D city models</title><link>https://ual.sg/post/2021/01/28/new-paper-3dfier-automatic-reconstruction-of-3d-city-models/</link><pubDate>Thu, 28 Jan 2021 09:40:16 +0800</pubDate><guid>https://ual.sg/post/2021/01/28/new-paper-3dfier-automatic-reconstruction-of-3d-city-models/</guid><description>&lt;p&gt;A new collaborative paper in which we have been involved has been published:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Ledoux H, Biljecki F, Dukai B, Kumar K, Peters R, Stoter J, Commandeur T (2021): 3dfier: automatic reconstruction of 3D city models. &lt;em&gt;Journal of Open Source Software&lt;/em&gt; 6(57): 2866. &lt;a href="https://doi.org/10.21105/joss.02866" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.21105/joss.02866&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-joss-3-dfier/2021-joss-3-dfier.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The paper describes &lt;a href="https://github.com/tudelft3d/3dfier" target="_blank" rel="noopener"&gt;3dfier&lt;/a&gt;, an open-source software package developed by &lt;a href="https://3d.bk.tudelft.nl" target="_blank" rel="noopener"&gt;our friends at TU Delft&lt;/a&gt; to automatically generate 3D city models from a 2D GIS dataset and a point cloud (extrusion).
Its advantages are speed, output in multiple formats (e.g. &lt;a href="https://www.cityjson.org" target="_blank" rel="noopener"&gt;CityJSON&lt;/a&gt;), and topological consistency.
The code has been released as open source and is available in its &lt;a href="https://github.com/tudelft3d/3dfier" target="_blank" rel="noopener"&gt;GitHub repo&lt;/a&gt;.
A video about the software with an example of the output is available &lt;a href="https://vimeo.com/181421237" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/p&gt;
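&lt;p&gt;To give a flavour of what extrusion entails, here is a minimal, self-contained Python sketch of the general technique (lifting a 2D footprint to 3D using heights sampled from a point cloud). It is purely illustrative and is not 3dfier&amp;rsquo;s actual implementation; the function names and the percentile choice are our own assumptions.&lt;/p&gt;

```python
# Conceptual sketch of "extrusion": lift a 2D building footprint to 3D
# using elevations from a point cloud. Illustration only -- not 3dfier code.

def point_in_polygon(pt, poly):
    """Ray-casting test: is 2D point pt inside polygon poly?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def extrude_footprint(footprint, point_cloud, percentile=0.9):
    """Return (roof_height, vertices) for an extruded footprint.

    The roof height is a percentile of the elevations of the points that
    fall inside the footprint, which keeps the estimate robust against
    outliers such as birds or antennas (a common practical choice).
    """
    zs = sorted(z for (x, y, z) in point_cloud
                if point_in_polygon((x, y), footprint))
    if not zs:
        raise ValueError("no points inside footprint")
    height = zs[min(int(percentile * len(zs)), len(zs) - 1)]
    base = [(x, y, 0.0) for (x, y) in footprint]
    roof = [(x, y, height) for (x, y) in footprint]
    return height, base + roof

# Toy example: a 10 m x 10 m footprint and a few LiDAR-like returns.
footprint = [(0, 0), (10, 0), (10, 10), (0, 10)]
cloud = [(5, 5, 12.1), (2, 3, 11.9), (8, 7, 12.3), (50, 50, 99.0)]
h, solid = extrude_footprint(footprint, cloud)  # the outlier at (50, 50) is ignored
```

&lt;p&gt;A real pipeline such as 3dfier additionally uses the semantics of each polygon (building, road, water, etc.) to decide how to lift it, and stitches the resulting surfaces into a watertight model.&lt;/p&gt;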
&lt;p&gt;The paper was published in the &lt;a href="https://joss.theoj.org" target="_blank" rel="noopener"&gt;Journal of Open Source Software&lt;/a&gt;, a developer-friendly, open-access journal for research software packages.
The journal adheres to all open science principles &amp;#x1f44f;, e.g. its peer review process is entirely open; for example, &lt;a href="https://github.com/openjournals/joss-reviews/issues/2866" target="_blank" rel="noopener"&gt;you can check the review process of our paper as a GitHub issue&lt;/a&gt;, which we find to be a very clever and commendable approach.&lt;/p&gt;
&lt;p&gt;The lead author is &lt;a href="https://3d.bk.tudelft.nl/hledoux/" target="_blank" rel="noopener"&gt;Dr Hugo Ledoux&lt;/a&gt; from the &lt;a href="https://3d.bk.tudelft.nl" target="_blank" rel="noopener"&gt;3D Geoinformation group&lt;/a&gt; at the Delft University of Technology.&lt;/p&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Three-dimensional city models are essential to assess the impact that environmental factors will have on citizens, because they are the input to several simulation and prediction software. Examples of such environmental factors are noise (Stoter et al., 2008), wind (Garcı́a-Sánchez et al., 2014), air pollution (Ujang et al., 2013), and temperature (Hsieh et al., 2011; Lee et al., 2013).
However, those 3D models, which typically contain buildings and other man-made objects such as roads, overpasses, bridges, and trees, are in practice complex to obtain, and it is very time-consuming and tedious to reconstruct them manually.
The software 3dfier addresses this issue by automating the 3D reconstruction process. It takes 2D geographical datasets (e.g., topographic datasets) that consist of polygons and “3dfies” them (as in “making them three-dimensional”). The elevation is obtained from an aerial point cloud dataset, and the semantics of the polygons is used to perform the lifting to the third dimension, so that it is realistic. The resulting 3D dataset is semantically decomposed/labelled based on the input polygons, and together they form one(many) surface(s) that aim(s) to be error-free: no self-intersections, no gaps, etc. Several output formats are supported (including the international standards), and the 3D city models are optimised for use in different software.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2021-joss-3-dfier/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2021-joss-3-dfier/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/01/28/new-paper-3dfier-automatic-reconstruction-of-3d-city-models/page-one_hu_2d90d00fca2bf7f9.webp 400w,
/post/2021/01/28/new-paper-3dfier-automatic-reconstruction-of-3d-city-models/page-one_hu_6ad9c086bed00e9b.webp 760w,
/post/2021/01/28/new-paper-3dfier-automatic-reconstruction-of-3d-city-models/page-one_hu_7803e82c49ec6deb.webp 1200w"
src="https://ual.sg/post/2021/01/28/new-paper-3dfier-automatic-reconstruction-of-3d-city-models/page-one_hu_2d90d00fca2bf7f9.webp"
width="538"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2021_joss_3dfier&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Ledoux, Hugo and Biljecki, Filip and Dukai, Balázs and Kumar, Kavisha and Peters, Ravi and Stoter, Jantien and Commandeur, Tom}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.21105/joss.02866}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Journal of Open Source Software}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{57}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2866}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{3dfier: automatic reconstruction of 3D city models}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{6}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2021}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: 3D city models for urban farming site identification in buildings</title><link>https://ual.sg/post/2021/01/11/new-paper-3d-city-models-for-urban-farming-site-identification-in-buildings/</link><pubDate>Mon, 11 Jan 2021 17:00:16 +0800</pubDate><guid>https://ual.sg/post/2021/01/11/new-paper-3d-city-models-for-urban-farming-site-identification-in-buildings/</guid><description>&lt;p&gt;We have a new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Palliwal A, Song S, Tan HTW, Biljecki F (2021): 3D city models for urban farming site identification in buildings. &lt;em&gt;Computers, Environment and Urban Systems&lt;/em&gt; 86: 101584. &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2020.101584" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.compenvurbsys.2020.101584&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-ceus-3-d-farming/2021-ceus-3-d-farming.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The first author is &lt;a href="https://ual.sg/author/ankit-palliwal/"&gt;Ankit Palliwal&lt;/a&gt;, who graduated with an MSc in Applied GIS and completed his graduation project with us; this paper is based on that work.&lt;/p&gt;
&lt;p&gt;The paper presents a new use case for 3D city models: using them to identify locations in buildings suitable for urban farming based on light conditions, and potentially recommending the optimal crop for a particular location and estimating its yield.&lt;/p&gt;
&lt;div class="alert alert-note"&gt;
&lt;div&gt;
Update (April 2021): this research was featured by the National University of Singapore as a &lt;a href="https://news.nus.edu.sg/novel-use-of-3d-geoinformation-to-identify-urban-farming-sites/" target="_blank" rel="noopener"&gt;news item&lt;/a&gt;.
&lt;/div&gt;
&lt;/div&gt;
&lt;h3 id="highlights"&gt;Highlights&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Residential buildings are becoming an increasingly relevant venue for urban farming.&lt;/li&gt;
&lt;li&gt;The suitability of a particular site depends mostly on the sunlight available there.&lt;/li&gt;
&lt;li&gt;Conventional methods to assess farming potential involve field visits and time-consuming measurements.&lt;/li&gt;
&lt;li&gt;We demonstrate a new application of 3D city models, showing that they can replace field surveys.&lt;/li&gt;
&lt;li&gt;The approach using 3D city models has advantages such as enabling urban-scale estimation of urban farming potential.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="abstract"&gt;Abstract&lt;/h3&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Studies have suggested that there is farming potential in urban residential buildings. However, these studies are limited in scope and require field visits and time-consuming measurements. Furthermore, they have not suggested ways to identify suitable sites on a larger scale, let alone means of surveying numerous micro-locations across the same building. Using a case study area focused on high-rise buildings in Singapore, this paper examines a novel application of three-dimensional (3D) city models to identify suitable farming micro-locations (level and orientation) in residential buildings. We specifically investigate whether the vertical spaces of these buildings comprising outdoor corridors, façades and windows receive sufficient photosynthetically active radiation (PAR) for growing food crops and do so at a high resolution. We also analyze the spatio-temporal characteristics of PAR, and the impact of shadows and different weather conditions on PAR in the building. Environmental simulations on the 3D model of the study area indicated that the cumulative daily PAR or Daily Light Integral (DLI) at a location in the building was dependent on its orientation and shape, sun&amp;rsquo;s diurnal and annual motion, weather conditions, and shadowing effects of the building&amp;rsquo;s own façades and surrounding buildings. The DLI in the study area generally increased with the building&amp;rsquo;s levels and, depending on the particular micro-location, was found suitable for growing moderately light-demanding crops such as lettuce and sweet pepper. These variations in DLI at different locations of the same building affirmed the need for such simulations.
The simulations were validated with field measurements of PAR, and correlation coefficients between them exceeded 0.5 in most cases, thus making a case that 3D city models offer a promising practical solution to identifying suitable farming locations in residential buildings, and have the potential for urban-scale applications.&lt;/p&gt;
&lt;/blockquote&gt;
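&lt;p&gt;As a rough, illustrative sketch (not the paper&amp;rsquo;s code): the Daily Light Integral mentioned in the abstract is the instantaneous PAR integrated over a day. With PAR sampled at a fixed interval, it can be approximated as below; the sample readings are hypothetical.&lt;/p&gt;

```python
# Convert photosynthetically active radiation (PAR) readings into a
# Daily Light Integral (DLI). PAR is in umol photons m^-2 s^-1; the
# DLI is in mol m^-2 day^-1 (hence the 1e6 factor).
def daily_light_integral(par_readings, interval_s):
    """Sum PAR samples taken every interval_s seconds into a DLI."""
    return sum(par_readings) * interval_s / 1e6

# Hourly PAR averages over 12 daylight hours (hypothetical values):
hourly_par = [50, 150, 300, 450, 600, 700, 700, 600, 450, 300, 150, 50]
dli = daily_light_integral(hourly_par, 3600)
print(round(dli, 1))  # prints 16.2
```

A micro-location's DLI can then be compared against the light requirements of candidate crops, which is the kind of suitability judgement the paper makes at scale.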
&lt;h3 id="paper"&gt;Paper&lt;/h3&gt;
&lt;p&gt;For more information, please see the &lt;a href="https://ual.sg/publication/2021-ceus-3-d-farming/"&gt;paper&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2021-ceus-3-d-farming/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/01/11/new-paper-3d-city-models-for-urban-farming-site-identification-in-buildings/page-one_hu_855056bc172bd237.webp 400w,
/post/2021/01/11/new-paper-3d-city-models-for-urban-farming-site-identification-in-buildings/page-one_hu_4f43984f3b71f7fa.webp 760w,
/post/2021/01/11/new-paper-3d-city-models-for-urban-farming-site-identification-in-buildings/page-one_hu_dbfa9ec336e45b11.webp 1200w"
src="https://ual.sg/post/2021/01/11/new-paper-3d-city-models-for-urban-farming-site-identification-in-buildings/page-one_hu_855056bc172bd237.webp"
width="570"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2021_ceus_3d_farming&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Palliwal, Ankit and Song, Shuang and Tan, Hugh Tiang Wah and Biljecki, Filip}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.compenvurbsys.2020.101584}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Computers, Environment and Urban Systems}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{101584}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{3D city models for urban farming site identification in buildings}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{86}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2021}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>OGC seeks public comment on new CityJSON Community Standard</title><link>https://ual.sg/post/2021/01/09/ogc-seeks-public-comment-on-new-cityjson-community-standard/</link><pubDate>Sat, 09 Jan 2021 10:04:15 +0800</pubDate><guid>https://ual.sg/post/2021/01/09/ogc-seeks-public-comment-on-new-cityjson-community-standard/</guid><description>&lt;p&gt;We&amp;rsquo;re relaying &lt;a href="https://www.ogc.org/pressroom/pressreleases/4381" target="_blank" rel="noopener"&gt;a press release by the Open Geospatial Consortium&lt;/a&gt; about &lt;a href="https://cityjson.org" target="_blank" rel="noopener"&gt;CityJSON&lt;/a&gt;, which we have (on behalf of the National University of Singapore as OGC member) submitted to OGC for adoption as a &lt;a href="https://www.ogc.org/standards/community" target="_blank" rel="noopener"&gt;Community Standard&lt;/a&gt; together with 6 other organisations (full list below), and have been using in our work, e.g. &lt;a href="https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/"&gt;to produce 3D data of Singapore&amp;rsquo;s public housing buildings&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id="ogc-seeks-public-comment-on-new-cityjson-community-standard"&gt;OGC seeks public comment on new CityJSON Community Standard&lt;/h2&gt;
&lt;h4 id="release-date-7-january-2021"&gt;Release Date: 7 January 2021&lt;/h4&gt;
&lt;p&gt;The Open Geospatial Consortium (OGC) seeks public comment on its adoption of CityJSON as an &lt;a href="https://www.ogc.org/standards/community" target="_blank" rel="noopener"&gt;OGC Community Standard&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;CityJSON is a &lt;a href="https://json.org/" target="_blank" rel="noopener"&gt;JSON&lt;/a&gt;-based encoding for a subset of the &lt;a href="http://www.ogc.org/standards/citygml" target="_blank" rel="noopener"&gt;OGC CityGML data model&lt;/a&gt; (version 2.0.0). CityJSON defines ways to describe most of the common 3D features and objects found in cities (such as buildings, roads, rivers, bridges, vegetation, and city furniture) and the relationships between them. It also defines different standard levels of detail (LoDs) for the 3D objects, which allows different resolutions of objects for different applications and purposes.&lt;/p&gt;
&lt;p&gt;A CityJSON file describes both the geometry (an object’s form) and the semantics (an object’s function) of the features in a given area, such as the buildings, roads, rivers, vegetation, and city furniture.&lt;/p&gt;
&lt;p&gt;The aim of CityJSON is to offer an alternative to the GML encoding of CityGML, which can be verbose and complex to read and manipulate. CityJSON aims to be easy to use, both for reading datasets and for creating them. It was designed with programmers in mind, so that tools and APIs supporting it can be quickly built.&lt;/p&gt;
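&lt;p&gt;To illustrate (a minimal, simplified sketch following the CityJSON 1.0 structure; the building, its attributes, and the triangle geometry are made up): because CityJSON is plain JSON, a dataset can be created and queried with nothing more than a standard JSON library.&lt;/p&gt;

```python
import json

# A minimal CityJSON 1.0 document with one building. The geometry is
# simplified to a single triangle for illustration; real datasets use
# richer geometries at standardised levels of detail (LoDs).
city = {
    "type": "CityJSON",
    "version": "1.0",
    "CityObjects": {
        "building-1": {
            "type": "Building",
            "attributes": {"storeysAboveGround": 12},
            "geometry": [
                {
                    "type": "MultiSurface",
                    "lod": 1,
                    # Boundaries index into the shared vertex list below
                    "boundaries": [[[0, 1, 2]]],
                }
            ],
        }
    },
    "vertices": [[0, 0, 0], [10, 0, 0], [10, 10, 30]],
}

# Round-trip through JSON and query it with ordinary dict access:
doc = json.loads(json.dumps(city))
buildings = [oid for oid, obj in doc["CityObjects"].items()
             if obj["type"] == "Building"]
print(buildings)  # prints ['building-1']
print(doc["CityObjects"]["building-1"]["attributes"]["storeysAboveGround"])  # prints 12
```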
&lt;p&gt;The differences between the CityJSON v1.0 implementation and the XML-based implementation are described in more detail on &lt;a href="https://www.cityjson.org/citygml-compatibility" target="_blank" rel="noopener"&gt;this webpage&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Please note that the &lt;a href="https://www.ogc.org/pressroom/pressreleases/4370" target="_blank" rel="noopener"&gt;CityGML 3.0 Conceptual Model specification is also currently open for public comments&lt;/a&gt;. CityGML 3.0 is a data model that has been re-structured to be independent of particular encodings; it may have a JSON encoding in the future. CityJSON 1.0 provides a directly implementable JSON encoding based on the CityGML 2.0 data model, an OGC standard since 2012. GML encodings and JSON encodings address different use cases – different communities looking to achieve different things, each useful in their own right. Hence, the OGC community sees value in offering both CityGML 3.0 and CityJSON 1.0.&lt;/p&gt;
&lt;p&gt;CityJSON has been submitted to OGC for adoption as a Community Standard by the following organizations: Geonovum, Delft University of Technology, Kadaster International, virtualcitySYSTEMS, National University of Singapore, Forum Virium Helsinki Oy, and Ordnance Survey.&lt;/p&gt;
&lt;p&gt;An OGC Community Standard is an official Standard of OGC that is considered to be a widely used, mature specification, but was developed outside of OGC’s standards development process. The originator of the standard brings to OGC a “snapshot” of their work that is then endorsed by OGC membership so that it can become part of the OGC Standards Baseline.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;The candidate &lt;a href="https://portal.ogc.org/files/?artifact_id=95618&amp;amp;version=1" target="_blank" rel="noopener"&gt;CityJSON Community Standard 1.0&lt;/a&gt; is available for review and comment on the &lt;a href="https://portal.ogc.org/files/?artifact_id=95618&amp;amp;version=1" target="_blank" rel="noopener"&gt;OGC Portal&lt;/a&gt;. Comments are due by February 7, 2021, and should be submitted via the method outlined on the &lt;a href="https://www.ogc.org/standards/requests/222" target="_blank" rel="noopener"&gt;CityJSON Community Standard 1.0’s public comment request page&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;
&lt;h3 id="about-ogc"&gt;About OGC&lt;/h3&gt;
&lt;p&gt;The Open Geospatial Consortium (OGC) is an international consortium of more than 500 businesses, government agencies, research organizations, and universities driven to make geospatial (location) information and services FAIR - Findable, Accessible, Interoperable, and Reusable.
OGC’s member-driven consensus process creates royalty free, publicly available geospatial standards. Existing at the cutting edge, OGC actively analyzes and anticipates emerging tech trends, and runs an agile, collaborative Research and Development (R&amp;amp;D) lab that builds and tests innovative prototype solutions to members&amp;rsquo; use cases.
OGC members together form a global forum of experts and communities that use location to connect people with technology and improve decision-making at all levels. OGC is committed to creating a sustainable future for us, our children, and future generations.
Visit &lt;a href="http://ogc.org/" target="_blank" rel="noopener"&gt;ogc.org&lt;/a&gt; for more info on our work.&lt;/p&gt;</description></item><item><title>Two new papers: Reference study of IFC/CityGML software support</title><link>https://ual.sg/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/</link><pubDate>Mon, 04 Jan 2021 08:00:16 +0800</pubDate><guid>https://ual.sg/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/</guid><description>
&lt;figure id="figure-cross-software-comparison-of-exporting-ifc-models"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Cross-software comparison of exporting IFC models." srcset="
/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/featured_hu_f68ea95c608c37ef.webp 400w,
/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/featured_hu_842e3bd05be970e9.webp 760w,
/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/featured_hu_9e5ae0c628d3debb.webp 1200w"
src="https://ual.sg/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/featured_hu_f68ea95c608c37ef.webp"
width="760"
height="676"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Cross-software comparison of exporting IFC models.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;The &lt;a href="https://ual.sg/project/geobim-benchmark"&gt;GeoBIM Benchmark 2019&lt;/a&gt; is finalised with two papers published in tandem in Transactions in GIS.
These back-to-back papers describe our findings on the software support of IFC and CityGML.
There is also a &lt;a href="https://ual.sg/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/"&gt;third paper&lt;/a&gt;, announced previously, describing the part of the benchmark focusing on BIM-GIS Integration (IFC georeferencing and conversions).&lt;/p&gt;
&lt;p&gt;The lead author of both papers is &lt;a href="http://www.noardo.eu" target="_blank" rel="noopener"&gt;Dr Francesca Noardo&lt;/a&gt; from the &lt;a href="https://3d.bk.tudelft.nl" target="_blank" rel="noopener"&gt;3D Geoinformation group&lt;/a&gt; at the Delft University of Technology.
We thank her, and the other members of the team, for involving us in their project and for leading a series of papers that thoroughly document the insights gained during this topical and very interesting initiative.&lt;/p&gt;
&lt;p&gt;The project was endorsed and co-funded by the &lt;a href="https://www.isprs.org" target="_blank" rel="noopener"&gt;International Society for Photogrammetry and Remote Sensing (ISPRS)&lt;/a&gt; and the &lt;a href="http://www.eurosdr.net" target="_blank" rel="noopener"&gt;European Spatial Data Research (EuroSDR)&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Both papers are described below, and both (as well as the third paper) are published as gold open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;.
Further, the final report of the project is available &lt;a href="https://ual.sg/publication/2020-geobim-final-report"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id="part-i-reference-study-of-ifc-software-support"&gt;Part I: Reference study of IFC software support&lt;/h2&gt;
&lt;blockquote&gt;
&lt;p&gt;Noardo F, Krijnen T, Arroyo Ohori K, Biljecki F, Ellul C, Harrie L, Eriksson H, Polia L, Salheb N, Tauscher H, van Liempt J, Goerne H, Hintz D, Kaiser T, Leoni C, Warchol A, Stoter J (2021): Reference study of IFC software support: The GeoBIM benchmark 2019&amp;mdash;Part I. &lt;em&gt;Transactions in GIS&lt;/em&gt; 25(2): 805-841. &lt;a href="https://doi.org/10.1111/tgis.12709" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1111/tgis.12709&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-tgis-geobim-ifc/2021-tgis-geobim-ifc.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Abstract:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Industry Foundation Classes (IFC), the buildingSMART open standard for BIM, is underused with respect to its promising potential, since, according to the experience of practitioners and researchers working with BIM, issues in the standard’s implementation and use prevent its effective use. Nevertheless, a systematic investigation of these issues has never been carried out, and there is thus insufficient evidence for tackling the problems. The GeoBIM benchmark project is aimed at finding such evidence by involving external volunteers, reporting on various aspects of the behavior of tools (geometry, semantics, georeferencing, functionalities), analyzed and described in this article. Interestingly, different IFC software programs with the same standardized data sets yield inconsistent results, with few detectable common patterns, and significant issues are found in their support of the standard, probably due to the very high complexity of the standard data model. A companion article (Part II) describes the results of the benchmark related to CityGML, the counterpart of IFC within geoinformation.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;For more information please see the &lt;a href="https://ual.sg/publication/2021-tgis-geobim-ifc/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2021-tgis-geobim-ifc/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/task1-page-one_hu_424ecb45e1a012c3.webp 400w,
/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/task1-page-one_hu_78a36684a301386c.webp 760w,
/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/task1-page-one_hu_1b28f44632fec7a.webp 1200w"
src="https://ual.sg/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/task1-page-one_hu_424ecb45e1a012c3.webp"
width="526"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2021_tgis_geobim_ifc&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Noardo, Francesca and Krijnen, Thomas and Arroyo Ohori, Ken and Biljecki, Filip and Ellul, Claire and Harrie, Lars and Eriksson, Helen and Polia, Lorenzo and Salheb, Nebras and Tauscher, Helga and van Liempt, Jordi and Goerne, Hendrik and Hintz, Dean and Kaiser, Tim and Leoni, Cristina and Warchol, Artur and Stoter, Jantien}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1111/tgis.12709}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Transactions in GIS}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{25}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;issue&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{805-841}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Reference study of IFC software support: The GeoBIM benchmark 2019---Part I}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2021}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;h2 id="part-ii-reference-study-of-citygml-software-support"&gt;Part II: Reference study of CityGML software support&lt;/h2&gt;
&lt;blockquote&gt;
&lt;p&gt;Noardo F, Arroyo Ohori K, Biljecki F, Ellul C, Harrie L, Krijnen T, Eriksson H, van Liempt J, Pla M, Ruiz A, Hintz D, Krueger N, Leoni C, Leoz L, Moraru D, Vitalis S, Willkomm P, Stoter J (2021): Reference study of CityGML software support: The GeoBIM benchmark 2019&amp;mdash;Part II. &lt;em&gt;Transactions in GIS&lt;/em&gt; 25(2): 842-868. &lt;a href="https://doi.org/10.1111/tgis.12710" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1111/tgis.12710&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-tgis-geobim-citygml/2021-tgis-geobim-citygml.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Abstract:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;OGC CityGML is an open standard for 3D city models intended to foster interoperability and support various applications. However, through our practical experience and discussions with practitioners, we have noticed several problems related to the implementation of the standard and the use of standardized data. Nevertheless, a systematic investigation of these issues has never been carried out, and there is thus insufficient evidence for tackling the problems. The GeoBIM benchmark project is aimed at finding such evidence by involving external volunteers, reporting on various aspects of the behavior of tools (geometry, semantics, georeferencing, functionalities), analyzed and described in this article. This study explicitly pointed out the critical points embedded in the format as an evidence base for future development. A companion article (Part I) describes the results of the benchmark related to IFC, the counterpart of CityGML within building information modeling.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;For more information please see the &lt;a href="https://ual.sg/publication/2021-tgis-geobim-citygml/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2021-tgis-geobim-citygml/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/task2-page-one_hu_edfd1a22965cce63.webp 400w,
/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/task2-page-one_hu_731251ff7882ef31.webp 760w,
/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/task2-page-one_hu_88b8d5df67201861.webp 1200w"
src="https://ual.sg/post/2021/01/04/two-new-papers-reference-study-of-ifc/citygml-software-support/task2-page-one_hu_edfd1a22965cce63.webp"
width="528"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2021_tgis_geobim_citygml&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Noardo, Francesca and Arroyo Ohori, Ken and Biljecki, Filip and Ellul, Claire and Harrie, Lars and Krijnen, Thomas and Eriksson, Helen and van Liempt, Jordi and Pla, Maria and Ruiz, Antonio and Hintz, Dean and Krueger, Nina and Leoni, Cristina and Leoz, Leire and Moraru, Diana and Vitalis, Stelios and Willkomm, Philipp and Stoter, Jantien}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1111/tgis.12710}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Transactions in GIS}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{25}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;issue&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{842-868}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Reference study of CityGML software support: The GeoBIM benchmark 2019---Part II}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2021}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>SDE4 recognised as the first Zero Energy building in Southeast Asia</title><link>https://ual.sg/post/2020/12/02/sde4-recognised-as-the-first-zero-energy-building-in-southeast-asia/</link><pubDate>Wed, 02 Dec 2020 14:18:08 +0800</pubDate><guid>https://ual.sg/post/2020/12/02/sde4-recognised-as-the-first-zero-energy-building-in-southeast-asia/</guid><description>&lt;p&gt;We are proud that SDE4, the new building of our NUS School of Design and Environment (SDE), is the first one in Southeast Asia to achieve the stringent &lt;a href="https://living-future.org/zero-energy/certification/" target="_blank" rel="noopener"&gt;Zero Energy Certification&lt;/a&gt; by the prestigious &lt;a href="https://living-future.org/" target="_blank" rel="noopener"&gt;International Living Future Institute&lt;/a&gt;. &amp;#x1f4aa;&lt;/p&gt;
&lt;figure id="figure-nus-sde4-has-a-hybrid-cooling-system-that-makes-optimal-use-of-air-conditioning-text--photo-courtesy-of-the-nus-school-of-design-and-environment-and-the-straits-times"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="NUS SDE4 has a hybrid cooling system that makes optimal use of air-conditioning. Text &amp; photo: Courtesy of the NUS School of Design and Environment and The Straits Times." srcset="
/post/2020/12/02/sde4-recognised-as-the-first-zero-energy-building-in-southeast-asia/hybrid_cooling_system_hu_521dbff518f13668.webp 400w,
/post/2020/12/02/sde4-recognised-as-the-first-zero-energy-building-in-southeast-asia/hybrid_cooling_system_hu_c5896c023b309a9e.webp 760w,
/post/2020/12/02/sde4-recognised-as-the-first-zero-energy-building-in-southeast-asia/hybrid_cooling_system_hu_60f181bbd5b42475.webp 1200w"
src="https://ual.sg/post/2020/12/02/sde4-recognised-as-the-first-zero-energy-building-in-southeast-asia/hybrid_cooling_system_hu_521dbff518f13668.webp"
width="760"
height="349"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
NUS SDE4 has a hybrid cooling system that makes optimal use of air-conditioning. Text &amp;amp; photo: Courtesy of the NUS School of Design and Environment and The Straits Times.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;This is not the first accolade that we can be happy about.
SDE4 is also the first university building in the world to be &lt;a href="https://www.wellcertified.com/" target="_blank" rel="noopener"&gt;WELL Certified Gold&lt;/a&gt;. &amp;#x1f64c;&lt;/p&gt;
&lt;p&gt;Read more in the &lt;a href="https://news.nus.edu.sg/nus-sde4-is-first-in-southeast-asia-to-achieve-ilfi-zero-energy-certification/" target="_blank" rel="noopener"&gt;press release by NUS&lt;/a&gt; or in The Straits Times article &lt;a href="https://www.straitstimes.com/singapore/when-good-design-works-with-nature-to-shape-a-sustainable-future" target="_blank" rel="noopener"&gt;&lt;em&gt;When good design works with nature to shape a sustainable future&lt;/em&gt;&lt;/a&gt;.&lt;/p&gt;
&lt;figure id="figure-the-groundbreaking-nus-sde4-building-demonstrates-ways-of-rethinking-building-design-to-become-more-sustainable-text--photo-courtesy-of-the-nus-school-of-design-and-environment-and-the-straits-times"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="The groundbreaking NUS SDE4 building demonstrates ways of rethinking building design to become more sustainable. Text &amp; photo: Courtesy of the NUS School of Design and Environment and The Straits Times." srcset="
/post/2020/12/02/sde4-recognised-as-the-first-zero-energy-building-in-southeast-asia/nus_sde4_kv_hu_f36ee38500e6cc6.webp 400w,
/post/2020/12/02/sde4-recognised-as-the-first-zero-energy-building-in-southeast-asia/nus_sde4_kv_hu_9d27e487dabd2e21.webp 760w,
/post/2020/12/02/sde4-recognised-as-the-first-zero-energy-building-in-southeast-asia/nus_sde4_kv_hu_4b445b80a6a888b5.webp 1200w"
src="https://ual.sg/post/2020/12/02/sde4-recognised-as-the-first-zero-energy-building-in-southeast-asia/nus_sde4_kv_hu_f36ee38500e6cc6.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
The groundbreaking NUS SDE4 building demonstrates ways of rethinking building design to become more sustainable. Text &amp;amp; photo: Courtesy of the NUS School of Design and Environment and The Straits Times.
&lt;/figcaption&gt;&lt;/figure&gt;</description></item><item><title>New paper: Extending CityGML for IFC-sourced 3D city models</title><link>https://ual.sg/post/2020/11/16/new-paper-extending-citygml-for-ifc-sourced-3d-city-models/</link><pubDate>Mon, 16 Nov 2020 05:56:16 +0800</pubDate><guid>https://ual.sg/post/2020/11/16/new-paper-extending-citygml-for-ifc-sourced-3d-city-models/</guid><description>&lt;p&gt;We published a new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Lim J, Crawford J, Moraru D, Tauscher H, Konde A, Adouane K, Lawrence S, Janssen P, Stouffs R (2021): Extending CityGML for IFC-sourced 3D city models. &lt;em&gt;Automation in Construction.&lt;/em&gt; 121: 103440. &lt;a href="https://doi.org/10.1016/j.autcon.2020.103440" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.autcon.2020.103440&lt;/a&gt; &lt;a href="https://ual.sg/publication/2021-autcon-ifc-citygml-ade/2021-autcon-ifc-citygml-ade.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This publication is one of the final papers stemming from the &lt;a href="https://ifc2citygml.github.io" target="_blank" rel="noopener"&gt;IFC2CityGML&lt;/a&gt; project (PI: Rudi Stouffs), carried out at NUS in collaboration with &lt;a href="https://www.ordnancesurvey.co.uk" target="_blank" rel="noopener"&gt;Ordnance Survey&lt;/a&gt;.
The lead author is &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The abstract follows:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Differences in the scope and intent of the contrasting IFC and CityGML data formats entail that converting the former to the latter results in loss of information.
However, for some use cases it is beneficial to keep also particular information from IFC that is not native to CityGML, and achieving that requires mechanisms such as the CityGML Application Domain Extension (ADE).
We develop an ADE to support retaining relevant information from IFC.
Besides being driven by the particular source of the input data (IFC), this multi-purpose ADE is shaped after a discovery process that involved examining potentially applicable use cases in Singapore, doubling as an extension that is adapted to a set of use cases and the local geographic context.
We implement the conceptual work by generating an enriched dataset (with an automatic conversion from IFC to CityGML), visualising it, and discuss its added value in a use case.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;As part of our commitment to open science, the publication is available in &lt;a href="https://ual.sg/publication/2021-autcon-ifc-citygml-ade/"&gt;green open access&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Other relevant papers from the project are available on the &lt;a href="https://ifc2citygml.github.io" target="_blank" rel="noopener"&gt;project&amp;rsquo;s website&lt;/a&gt;, including &lt;a href="https://ual.sg/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/"&gt;the publication on the developed web viewer&lt;/a&gt;, which is mentioned in the implementation section of the paper.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2021-autcon-ifc-citygml-ade/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/11/16/new-paper-extending-citygml-for-ifc-sourced-3d-city-models/page-one_hu_99854b465d28a3dc.webp 400w,
/post/2020/11/16/new-paper-extending-citygml-for-ifc-sourced-3d-city-models/page-one_hu_289c6212487e1909.webp 760w,
/post/2020/11/16/new-paper-extending-citygml-for-ifc-sourced-3d-city-models/page-one_hu_9b5ac5dd7452a89a.webp 1200w"
src="https://ual.sg/post/2020/11/16/new-paper-extending-citygml-for-ifc-sourced-3d-city-models/page-one_hu_99854b465d28a3dc.webp"
width="569"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2021_autcon_ifc_citygml_ade&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Biljecki, Filip and Lim, Joie and Crawford, James and Moraru, Diana and Tauscher, Helga and Konde, Amol and Adouane, Kamel and Lawrence, Simon and Janssen, Patrick and Stouffs, Rudi}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.autcon.2020.103440}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Automation in Construction}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{103440}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Extending CityGML for IFC-sourced 3D city models}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{121}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2021}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Meet our junior members</title><link>https://ual.sg/post/2020/11/03/meet-our-junior-members/</link><pubDate>Tue, 03 Nov 2020 15:00:28 +0800</pubDate><guid>https://ual.sg/post/2020/11/03/meet-our-junior-members/</guid><description>&lt;p&gt;We are happy to work with talented students to support our research activities.
We are pleased to introduce five new members of our team.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="zhong-shiyue"&gt;Zhong Shiyue&lt;/h2&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/11/03/meet-our-junior-members/profile_shiyue_hu_ed9779415105b425.webp 400w,
/post/2020/11/03/meet-our-junior-members/profile_shiyue_hu_f93f595b87b042db.webp 760w,
/post/2020/11/03/meet-our-junior-members/profile_shiyue_hu_ac6dfc09cf2382d9.webp 1200w"
src="https://ual.sg/post/2020/11/03/meet-our-junior-members/profile_shiyue_hu_ed9779415105b425.webp"
width="760"
height="526"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;blockquote&gt;
&lt;p&gt;“My passion for interdisciplinary research brought me on the path of research in GIS and psychology, and inspires me every day to see the world from multiple angles!”&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="bio"&gt;Bio&lt;/h3&gt;
&lt;p&gt;I am a graduate student pursuing a Master in International Affairs at the Lee Kuan Yew School of Public Policy at NUS. I received my bachelor’s degree with majors in International Studies, Philosophy, and Cognitive Science from Macalester College (St. Paul, Minnesota, USA). My working experience includes roles as Editorial Assistant at the Journal of Asian Studies, Project Lead at the Think Tanks and Civil Societies Program, and Teaching Fellow at Breakthrough Twin Cities.&lt;/p&gt;
&lt;h3 id="activity"&gt;Activity&lt;/h3&gt;
&lt;p&gt;Understanding the impact of training and bias in mapping.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="lawrence-chew"&gt;Lawrence Chew&lt;/h2&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/11/03/meet-our-junior-members/profile_lawrence_hu_9861736d6a401245.webp 400w,
/post/2020/11/03/meet-our-junior-members/profile_lawrence_hu_cd7596e547f9eea8.webp 760w,
/post/2020/11/03/meet-our-junior-members/profile_lawrence_hu_9b6f3190d167fb63.webp 1200w"
src="https://ual.sg/post/2020/11/03/meet-our-junior-members/profile_lawrence_hu_9861736d6a401245.webp"
width="760"
height="526"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;blockquote&gt;
&lt;p&gt;“The study of geography is about more than just memorising places on a map. It’s about understanding the complexity of our world.” - Barack Obama&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="bio-1"&gt;Bio&lt;/h3&gt;
&lt;p&gt;I am a Year 4 NUS Geography undergraduate with a double minor in Geospatial Information Systems (GIS) &amp;amp; Urban Studies, with skill sets in Geospatial Analytics, Data Analysis, and Cartography. I am especially interested in Geospatial Analysis and Geospatial Database Management, with the intention of adding value in Urban Planning and Development.&lt;/p&gt;
&lt;h3 id="activity-1"&gt;Activity&lt;/h3&gt;
&lt;p&gt;Exploration of building data openly released by governments.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="koichi-ito"&gt;Koichi Ito&lt;/h2&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/11/03/meet-our-junior-members/profile_koichi_hu_ad026d689ad7fda9.webp 400w,
/post/2020/11/03/meet-our-junior-members/profile_koichi_hu_3e18dc400e01ab30.webp 760w,
/post/2020/11/03/meet-our-junior-members/profile_koichi_hu_a08ebb42c7ab45c7.webp 1200w"
src="https://ual.sg/post/2020/11/03/meet-our-junior-members/profile_koichi_hu_ad026d689ad7fda9.webp"
width="760"
height="526"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;blockquote&gt;
&lt;p&gt;“It brings me joy to tap into the potential of emerging urban data with talented and passionate people at the Lab!”&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="bio-2"&gt;Bio&lt;/h3&gt;
&lt;p&gt;I am a Year 2 student in the Master of Urban Planning program. My working experience includes internships at various IT startups in Japan and at an international non-profit organization in Senegal, an urban development research internship at the World Bank Group, and a software engineering position at a healthcare e-commerce startup. I hold a Bachelor of Arts degree in Liberal Arts from Soka University of America.&lt;/p&gt;
&lt;h3 id="activity-2"&gt;Activity&lt;/h3&gt;
&lt;p&gt;Applications of computer vision techniques and street-level imagery in urban studies.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="kay-lee"&gt;Kay Lee&lt;/h2&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/11/03/meet-our-junior-members/profile_kay_hu_8f74eebaac723154.webp 400w,
/post/2020/11/03/meet-our-junior-members/profile_kay_hu_57a9b2a2d4481ae6.webp 760w,
/post/2020/11/03/meet-our-junior-members/profile_kay_hu_a64df3cc02e7b247.webp 1200w"
src="https://ual.sg/post/2020/11/03/meet-our-junior-members/profile_kay_hu_8f74eebaac723154.webp"
width="760"
height="526"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;blockquote&gt;
&lt;p&gt;“Geospatial data is critical to gaining insight into how our cities work from the ground up as well as the processes that shape our lives.”&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="bio-3"&gt;Bio&lt;/h3&gt;
&lt;p&gt;I am a third-year undergraduate at Yale-NUS from Singapore, majoring in Urban Studies alongside a minor in Anthropology. I am interested in understanding our cities and contributing to positive change through both data-based and ethnographic research. I am keen on pursuing a career in city research and urban planning in the future.&lt;/p&gt;
&lt;h3 id="activity-3"&gt;Activity&lt;/h3&gt;
&lt;p&gt;Quality assessment and analysis of open building datasets.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id="chen-xinyu"&gt;Chen Xinyu&lt;/h2&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/11/03/meet-our-junior-members/profile_xinyu_hu_c28322d94bf98ca9.webp 400w,
/post/2020/11/03/meet-our-junior-members/profile_xinyu_hu_fb7962a798a633.webp 760w,
/post/2020/11/03/meet-our-junior-members/profile_xinyu_hu_f6d69a602ac4f193.webp 1200w"
src="https://ual.sg/post/2020/11/03/meet-our-junior-members/profile_xinyu_hu_c28322d94bf98ca9.webp"
width="760"
height="526"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;blockquote&gt;
&lt;p&gt;“I am very excited to use data and technical skills to explore urban dynamics and devise solutions to urban problems.”&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="bio-4"&gt;Bio&lt;/h3&gt;
&lt;p&gt;I am currently pursuing a Master of Science in Applied GIS degree. Before joining the Lab, I spent one year studying in the Master of Urban Planning programme, and I earned a Bachelor of Science degree in Human Geography and Urban-Rural Planning from Sun Yat-sen University, China. I have participated in several spatial data science projects in the areas of crime, public health, and Airbnb. My working experience includes an internship in real estate consulting at JLL and an internship in strategy consulting at Detecon.&lt;/p&gt;
&lt;h3 id="activity-4"&gt;Activity&lt;/h3&gt;
&lt;p&gt;Teaching activities in Geographic Information Systems (GIS) and Cartography.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;Welcome everyone!&lt;/p&gt;
&lt;p&gt;We might be &lt;a href="https://ual.sg/openings/ads/2020-student-researcher-gis/"&gt;accepting new student researchers next year&lt;/a&gt;, so keep an eye on our &lt;a href="https://ual.sg/openings"&gt;openings page&lt;/a&gt; if interested.&lt;/p&gt;</description></item><item><title>Recent activities of the Lab</title><link>https://ual.sg/post/2020/10/28/recent-activities-of-the-lab/</link><pubDate>Wed, 28 Oct 2020 21:00:28 +0800</pubDate><guid>https://ual.sg/post/2020/10/28/recent-activities-of-the-lab/</guid><description>&lt;p&gt;After we&amp;rsquo;ve enjoyed taking part in the
&lt;a href="https://ual.sg/post/2020/07/05/our-participation-at-the-state-of-the-map-2020/"&gt;State of the Map&lt;/a&gt;,
&lt;a href="https://ual.sg/post/2020/06/08/keynote-at-iacad-2020/"&gt;IACAD&lt;/a&gt;, and
&lt;a href="https://ual.sg/post/2020/09/13/participation-at-the-3d-geoinfo-2020-conference/"&gt;3D GeoInfo&lt;/a&gt;,
we presented the ongoing work of the Lab at other venues.&lt;/p&gt;
&lt;p&gt;Earlier in September, the activities of the NUS Urban Analytics Lab were presented at the &lt;a href="https://www.geoworks.sg/programmes-n-initiatives/singapore-geospatial-week" target="_blank" rel="noopener"&gt;Singapore Geospatial Week&lt;/a&gt; (organised by SLA) and more recently at Singapore Management University.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/10/28/recent-activities-of-the-lab/featured_hu_a3934df4c26e7380.webp 400w,
/post/2020/10/28/recent-activities-of-the-lab/featured_hu_f6a9d038f520de30.webp 760w,
/post/2020/10/28/recent-activities-of-the-lab/featured_hu_1dbd3c8cff5e9163.webp 1200w"
src="https://ual.sg/post/2020/10/28/recent-activities-of-the-lab/featured_hu_a3934df4c26e7380.webp"
width="671"
height="393"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Furthermore, the Lab was in the spotlight at an inaugural lecture at Gadjah Mada University, given to the Indonesian geomatics academic community.&lt;/p&gt;
&lt;p&gt;Thanks to all the parties for inviting us to share our work.&lt;/p&gt;
&lt;p&gt;Oh, and we also had our Lab seminar! &amp;#x1f60a;
&lt;a href="https://ual.sg/author/abraham-noah-wu/"&gt;Abraham Noah Wu&lt;/a&gt; presented his very cool ongoing work on deriving new geospatial datasets to promote carbon neutrality and utilisation of underused spaces. &amp;#x270c;&amp;#xfe0f;&lt;/p&gt;
&lt;p&gt;Stay tuned for more!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/10/28/recent-activities-of-the-lab/lab-meeting_hu_c6ce36071cc3a5f7.webp 400w,
/post/2020/10/28/recent-activities-of-the-lab/lab-meeting_hu_a868d1e2fa4c379c.webp 760w,
/post/2020/10/28/recent-activities-of-the-lab/lab-meeting_hu_fe398b002882ae4b.webp 1200w"
src="https://ual.sg/post/2020/10/28/recent-activities-of-the-lab/lab-meeting_hu_c6ce36071cc3a5f7.webp"
width="760"
height="356"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>New report: Smart at Scale: Cities to Watch (25 Case Studies)</title><link>https://ual.sg/post/2020/10/27/new-report-smart-at-scale-cities-to-watch-25-case-studies/</link><pubDate>Tue, 27 Oct 2020 16:33:08 +0800</pubDate><guid>https://ual.sg/post/2020/10/27/new-report-smart-at-scale-cities-to-watch-25-case-studies/</guid><description>&lt;p&gt;The &lt;a href="https://www.weforum.org" target="_blank" rel="noopener"&gt;World Economic Forum (WEF)&lt;/a&gt; has published a new report: &lt;em&gt;Smart at Scale: Cities to Watch&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;The PI of the Lab, Dr &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; took part in the research and in writing the report, as Fellow in the WEF Global Future Council on Cities and Urbanization.&lt;/p&gt;
&lt;p&gt;The Council on Cities and Urbanization leveraged its knowledge, its networks, and the public to identify 25 leading smart city projects that have successfully moved beyond the pilot stage.
The Council created a platform on the Forum’s website for the public to submit their case studies, and asked all Council members to notify their networks and spread the word through their social media platforms.
These projects have leveraged critical success factors to move smart, sustainable, and innovative initiatives to scale, including the following:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Public-private cooperation&lt;/li&gt;
&lt;li&gt;Ambitious and strategic actions to meet commitments of the Paris Agreement on climate change, the Sustainable Development Goals and the New Urban Agenda&lt;/li&gt;
&lt;li&gt;An innovative or future-oriented focus&lt;/li&gt;
&lt;li&gt;Scalability and a proven positive impact on social, environmental and economic aspects of the city&lt;/li&gt;
&lt;li&gt;Agile and smart governance, policy, technology, business models and financing&lt;/li&gt;
&lt;li&gt;Significant leadership and credibility with respect to the local context&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The report is available &lt;a href="https://ual.sg/publication/2020-wef/2020-wef.pdf"&gt;here&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Guest lecture by Jinal Foflia</title><link>https://ual.sg/post/2020/10/17/guest-lecture-by-jinal-foflia/</link><pubDate>Sat, 17 Oct 2020 20:00:28 +0800</pubDate><guid>https://ual.sg/post/2020/10/17/guest-lecture-by-jinal-foflia/</guid><description>&lt;p&gt;We are very pleased that &lt;a href="https://twitter.com/fofliajinal" target="_blank" rel="noopener"&gt;Ms Jinal Foflia&lt;/a&gt;, Senior Program Manager &amp;ndash; Geo at Grab has visited us to give a guest lecture as part of our module &lt;a href="https://ual.sg/teaching"&gt;&lt;em&gt;Geographic Information Systems (GIS) and Cartography&lt;/em&gt;&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Jinal gave an exciting and engaging talk on her experience with OpenStreetMap and shared her vision of future developments, from both personal and corporate perspectives.&lt;/p&gt;
&lt;p&gt;Many thanks, Jinal.
You are welcome back at NUS at any time.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/10/17/guest-lecture-by-jinal-foflia/2_hu_7b59213690236751.webp 400w,
/post/2020/10/17/guest-lecture-by-jinal-foflia/2_hu_fc4c5c963745deeb.webp 760w,
/post/2020/10/17/guest-lecture-by-jinal-foflia/2_hu_67b4b4e668397092.webp 1200w"
src="https://ual.sg/post/2020/10/17/guest-lecture-by-jinal-foflia/2_hu_7b59213690236751.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>The PI of the Lab elected co-chair of the Open Geospatial Consortium 3DIM DWG</title><link>https://ual.sg/post/2020/09/23/the-pi-of-the-lab-elected-co-chair-of-the-open-geospatial-consortium-3dim-dwg/</link><pubDate>Wed, 23 Sep 2020 14:58:26 +0800</pubDate><guid>https://ual.sg/post/2020/09/23/the-pi-of-the-lab-elected-co-chair-of-the-open-geospatial-consortium-3dim-dwg/</guid><description>&lt;p&gt;The principal investigator of the &lt;a href="https://ual.sg/"&gt;NUS Urban Analytics Lab&lt;/a&gt;, Dr &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, has been elected co-chair of the
&lt;a href="https://www.ogc.org/projects/groups/3dimdwg" target="_blank" rel="noopener"&gt;3D Information Management (3DIM) Domain Working Group&lt;/a&gt; at the &lt;a href="https://www.ogc.org" target="_blank" rel="noopener"&gt;Open Geospatial Consortium (OGC)&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;OGC is an international consortium of more than 500 businesses, government agencies, research organizations, and universities driven to make geospatial (location) information and services FAIR - Findable, Accessible, Interoperable, and Reusable.&lt;/p&gt;
&lt;p&gt;The 3D Information Management (3DIM) Domain Working Group facilitates the definition and development of interface and encoding standards that enable the development of software solutions allowing infrastructure owners, builders, emergency responders, community planners, and the travelling public to better manage and navigate complex built environments.&lt;/p&gt;</description></item><item><title>New paper: How might an LoD Logic Framework Help to Bridge the 3D Cadastre Research-to-Practice Gap?</title><link>https://ual.sg/post/2020/09/15/new-paper-how-might-an-lod-logic-framework-help-to-bridge-the-3d-cadastre-research-to-practice-gap/</link><pubDate>Tue, 15 Sep 2020 08:00:16 +0800</pubDate><guid>https://ual.sg/post/2020/09/15/new-paper-how-might-an-lod-logic-framework-help-to-bridge-the-3d-cadastre-research-to-practice-gap/</guid><description>&lt;p&gt;A new collaborative paper in which we have been involved has been published:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Stoter J, Biljecki F, Ho S (2020): How might an LoD Logic Framework Help to Bridge the 3D Cadastre Research-to-Practice Gap? A Proposal for a Level of Implementation Framework. &lt;em&gt;FIG Working Week 2020.&lt;/em&gt; &lt;a href="https://ual.sg/publication/2020-fig-lod/2020-fig-lod.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The paper is part of the &lt;a href="https://www.fig.net/fig2020/" target="_blank" rel="noopener"&gt;FIG Working Week 2020&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The lead author is &lt;a href="https://3d.bk.tudelft.nl/jstoter/" target="_blank" rel="noopener"&gt;Prof. dr. Jantien Stoter&lt;/a&gt; from the &lt;a href="https://3d.bk.tudelft.nl" target="_blank" rel="noopener"&gt;3D Geoinformation group&lt;/a&gt; at the Delft University of Technology, and the work was done in collaboration with &lt;a href="https://www.rmit.edu.au/contact/staff-contacts/academic-staff/h/ho-dr-serene" target="_blank" rel="noopener"&gt;Dr Serene Ho&lt;/a&gt; (RMIT University).&lt;/p&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;During the past decade, hundreds of research papers have been published on the challenge of registering multi-level properties in land administration and cadastral registrations. In addition, many pilots have been carried out to show potential solutions. However, fundamental and standardised solutions for 3D cadastre are still rare. In this article we analyse the reasons for few 3D cadastre solutions in practice and we propose a 3D cadastre definition framework that can distinguish between different levels of 3D cadastre implementation depending on a specific context. Based on a level of detail logic, it supports an incremental pathway for the implementation of 3D cadastre solutions. We list the scope of the framework and finish with conclusions and future work.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;For more information please see the &lt;a href="https://ual.sg/publication/2020-fig-lod/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2020-fig-lod/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/09/15/new-paper-how-might-an-lod-logic-framework-help-to-bridge-the-3d-cadastre-research-to-practice-gap/page-one_hu_782ba2328bd2b002.webp 400w,
/post/2020/09/15/new-paper-how-might-an-lod-logic-framework-help-to-bridge-the-3d-cadastre-research-to-practice-gap/page-one_hu_ac9c0d9f0ea74725.webp 760w,
/post/2020/09/15/new-paper-how-might-an-lod-logic-framework-help-to-bridge-the-3d-cadastre-research-to-practice-gap/page-one_hu_99fe549c12bd6bce.webp 1200w"
src="https://ual.sg/post/2020/09/15/new-paper-how-might-an-lod-logic-framework-help-to-bridge-the-3d-cadastre-research-to-practice-gap/page-one_hu_782ba2328bd2b002.webp"
width="536"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@inproceedings&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2020_fig_lod&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;address&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Amsterdam, the Netherlands}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Stoter, Jantien and Biljecki, Filip and Ho, Serene}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;booktitle&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{FIG Working Week 2020}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{1--13}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{How might an LoD Logic Framework Help to Bridge the 3D Cadastre Research-to-Practice Gap? A Proposal for a Level of Implementation Framework}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2020}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Tools for BIM-GIS Integration (IFC Georeferencing and Conversions): Results from the GeoBIM Benchmark 2019</title><link>https://ual.sg/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/</link><pubDate>Mon, 14 Sep 2020 08:00:16 +0800</pubDate><guid>https://ual.sg/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/</guid><description>
&lt;figure id="figure-views-of-the-myran-model-converted-to-citygml-by-the-test-agis-fme-ifcr-l1-visualized-in-azul-in-this-case-the-roof-is-missing"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Views of the Myran model converted to CityGML by the test AGIS-FME-IFCr-L1, visualized in azul. In this case, the roof is missing." srcset="
/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/featured_hu_b26c6cee0f653f60.webp 400w,
/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/featured_hu_7a16b3e09e2043da.webp 760w,
/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/featured_hu_bfca50dd65a2cecd.webp 1200w"
src="https://ual.sg/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/featured_hu_b26c6cee0f653f60.webp"
width="760"
height="362"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Views of the Myran model converted to CityGML by the test AGIS-FME-IFCr-L1, visualized in azul. In this case, the roof is missing.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;A new collaborative paper in which we have been involved has been published:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Noardo F, Harrie L, Arroyo Ohori K, Biljecki F, Ellul C, Krijnen, T, Eriksson H, Guler D, Hintz D, Jadidi M, Pla M, Sanchez S, Soini V, Stouffs R, Tekavec J, Stoter J (2020): Tools for BIM-GIS Integration (IFC Georeferencing and Conversions): Results from the GeoBIM Benchmark 2019. &lt;em&gt;ISPRS International Journal of Geo-Information.&lt;/em&gt; 9(9): 502. &lt;a href="https://doi.org/10.3390/ijgi9090502" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.3390/ijgi9090502&lt;/a&gt; &lt;a href="https://ual.sg/publication/2020-ijgi-geobim-integration/2020-ijgi-geobim-integration.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The lead author is &lt;a href="http://www.noardo.eu" target="_blank" rel="noopener"&gt;Dr Francesca Noardo&lt;/a&gt; from the &lt;a href="https://3d.bk.tudelft.nl" target="_blank" rel="noopener"&gt;3D Geoinformation group&lt;/a&gt; at the Delft University of Technology.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Update 2020-09-26:&lt;/strong&gt; The paper was featured as the cover story of the September 2020 issue of IJGI!&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/cover_hu_bed431c1270865bc.webp 400w,
/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/cover_hu_428b0e11b9df9a0a.webp 760w,
/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/cover_hu_5acc979e00b75cac.webp 1200w"
src="https://ual.sg/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/cover_hu_bed431c1270865bc.webp"
width="536"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The integration of 3D city models with Building Information Models (BIM), coined as GeoBIM, facilitates improved data support to several applications, e.g., 3D map updates, building permits issuing, detailed city analysis, infrastructure design, context-based building design, to name a few. To solve the integration, several issues need to be tackled and solved, i.e., harmonization of features, interoperability, format conversions, integration of procedures. The GeoBIM benchmark 2019, funded by ISPRS and EuroSDR, evaluated the state of implementation of tools addressing some of those issues. In particular, in the part of the benchmark described in this paper, the application of georeferencing to Industry Foundation Classes (IFC) models and making consistent conversions between 3D city models and BIM are investigated, considering the OGC CityGML and buildingSMART IFC as reference standards. In the benchmark, sample datasets in the two reference standards were provided. External volunteers were asked to describe and test georeferencing procedures for IFC models and conversion tools between CityGML and IFC. From the analysis of the delivered answers and processed datasets, it was possible to notice that while there are tools and procedures available to support georeferencing and data conversion, comprehensive definition of the requirements, clear rules to perform such two tasks, as well as solid technological solutions implementing them, are still lacking in functionalities. Those specific issues can be a sensible starting point for planning the next GeoBIM integration agendas.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;For more information please see the &lt;a href="https://ual.sg/publication/2020-ijgi-geobim-integration/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2020-ijgi-geobim-integration/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/page-one_hu_6d22e463d3756f52.webp 400w,
/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/page-one_hu_e4d64ad0d546bc43.webp 760w,
/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/page-one_hu_dedafde75ddce00.webp 1200w"
src="https://ual.sg/post/2020/09/14/new-paper-tools-for-bim-gis-integration-ifc-georeferencing-and-conversions-results-from-the-geobim-benchmark-2019/page-one_hu_6d22e463d3756f52.webp"
width="537"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2020_ijgi_geobim_integration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Noardo, Francesca and Harrie, Lars and Ohori, Ken Arroyo and Biljecki, Filip and Ellul, Claire and Krijnen, Thomas and Eriksson, Helen and Guler, Dogus and Hintz, Dean and Jadidi, Mojgan A and Pla, Maria and Sanchez, Santi and Soini, Ville-Pekka and Stouffs, Rudi and Tekavec, Jernej and Stoter, Jantien}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.3390/ijgi9090502}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS International Journal of Geo-Information}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;number&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{9}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{502}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Tools for BIM-GIS Integration (IFC Georeferencing and Conversions): Results from the GeoBIM Benchmark 2019}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{9}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2020}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Participation at the 3D GeoInfo 2020 conference</title><link>https://ual.sg/post/2020/09/13/participation-at-the-3d-geoinfo-2020-conference/</link><pubDate>Sun, 13 Sep 2020 15:07:53 +0800</pubDate><guid>https://ual.sg/post/2020/09/13/participation-at-the-3d-geoinfo-2020-conference/</guid><description>&lt;p&gt;3D GeoInfo is a key conference in the 3D GIS community.
&lt;a href="https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/"&gt;Last year we have been involved in its organisation in Singapore&lt;/a&gt;.
&lt;a href="https://www.ucl.ac.uk/3dgeoinfo/" target="_blank" rel="noopener"&gt;This year&lt;/a&gt;, the 15th instance, was organised by the University College London, together with the 3rd BIM/GIS Integration Workshop.&lt;/p&gt;
&lt;p&gt;We participated with two papers:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://ual.sg/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/"&gt;Exploration of open data in Southeast Asia to generate 3D building models&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://ual.sg/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/"&gt;Visualising detailed CityGML and ADE at the building scale&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;All the papers from the conference have been published open access in the &lt;a href="https://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/VI-4-W1-2020/" target="_blank" rel="noopener"&gt;ISPRS Annals&lt;/a&gt; and &lt;a href="https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLIV-4-W1-2020/" target="_blank" rel="noopener"&gt;ISPRS Archives&lt;/a&gt;.
The event also included &lt;a href="https://www.ucl.ac.uk/3dgeoinfo/3d-geoinfo-2020-keynotes" target="_blank" rel="noopener"&gt;three insightful keynotes&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;It was quite interesting to follow the online format; the organisation of the event was top notch and ensured that little was lost by moving the event to a virtual format.&lt;/p&gt;
&lt;p&gt;Congratulations to everyone, especially to the authors who received awards for their best paper/presentation.&lt;/p&gt;
&lt;p&gt;Furthermore, many thanks to the organisation team and the sponsors for making it possible.&lt;/p&gt;</description></item><item><title>New paper: Exploration of open data in Southeast Asia to generate 3D building models</title><link>https://ual.sg/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/</link><pubDate>Sat, 12 Sep 2020 08:35:16 +0800</pubDate><guid>https://ual.sg/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/</guid><description>
&lt;figure id="figure-the-completeness-of-building-levels-in-the-largest-cities-in-the-eleven-countries-each-map-shows-an-extent-of-approx-25-x-25-km-c-openstreetmap-contributors"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="The completeness of building levels in the largest cities in the eleven countries. Each map shows an extent of approx. 25 x 25 km. (c) OpenStreetMap contributors." srcset="
/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/featured_hu_6d386f90a27363c5.webp 400w,
/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/featured_hu_f3451fc6f0c3c17e.webp 760w,
/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/featured_hu_d5efbeed966d1541.webp 1200w"
src="https://ual.sg/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/featured_hu_6d386f90a27363c5.webp"
width="760"
height="311"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
The completeness of building levels in the largest cities in the eleven countries. Each map shows an extent of approx. 25 x 25 km. (c) OpenStreetMap contributors.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;We published a new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F (2020): Exploration of open data in Southeast Asia to generate 3D building models. &lt;em&gt;ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences.&lt;/em&gt; VI-4/W1-2020: 37-44. &lt;a href="https://doi.org/10.5194/isprs-annals-vi-4-w1-2020-37-2020" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-vi-4-w1-2020-37-2020&lt;/a&gt; &lt;a href="https://ual.sg/publication/2020-3-dgeoinfo-3-d-asean/2020-3-dgeoinfo-3-d-asean.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This article investigates the current status of generating 3D building models across 11 countries in Southeast Asia from publicly available data, primarily volunteered geoinformation (OpenStreetMap). The following countries are analysed: Brunei, Cambodia, Indonesia, Laos, Malaysia, Myanmar, Philippines, Singapore, Thailand, Timor-Leste, and Vietnam. This cross-country study includes multiple spatial levels of analysis: country, town, and micro-level (smaller neighbourhood). The main finding is that authoritative data to generate 3D building models is almost non-existent while building completeness in OpenStreetMap is highly heterogeneous, yielding location-dependent conclusions. While in general just a fraction of mapped buildings has height information and none of the administrative areas provides sufficient information to generate 3D building models, on a micro-level some areas are fully complete, providing a high potential to generate 3D building models on a precinct scale, which may be useful for certain spatial analyses. Furthermore, some areas have high building completeness, requiring only half of the work necessary for the extrusion: the collection of building height attributes. As a part of this work, a semantic 3D building model of a selected set of buildings in Singapore has been generated and released as open data (CityJSON), and the developed code was open-sourced.&lt;/p&gt;
&lt;p&gt;For more information please see the &lt;a href="https://ual.sg/publication/2020-3-dgeoinfo-3-d-asean/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2020-3-dgeoinfo-3-d-asean/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/page-one_hu_fe0419b3b30d9d07.webp 400w,
/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/page-one_hu_8c693f9367d5b160.webp 760w,
/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/page-one_hu_604dbf8148203bab.webp 1200w"
src="https://ual.sg/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/page-one_hu_fe0419b3b30d9d07.webp"
width="609"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2020_3dgeoinfo_3d_asean&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Biljecki, F}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-annals-vi-4-w1-2020-37-2020}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{37--44}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Exploration of open data in Southeast Asia to generate 3D building models}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{VI-4/W1-2020}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2020}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>New paper: Visualising detailed CityGML and ADE at the building scale</title><link>https://ual.sg/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/</link><pubDate>Fri, 11 Sep 2020 08:35:16 +0800</pubDate><guid>https://ual.sg/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/</guid><description>
&lt;figure id="figure-file-5-viewed-using-3dcitydb-web-map-client-it-is-possible-to-set-it-up-to-display-a-set-of-selected-citygml-properties"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="File 5 viewed using 3DCityDB web-map-client. It is possible to set it up to display a set of selected CityGML properties." srcset="
/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/featured_hu_143c0997961ae70b.webp 400w,
/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/featured_hu_fd99abe0f7eba805.webp 760w,
/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/featured_hu_f8410034e2eb7d99.webp 1200w"
src="https://ual.sg/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/featured_hu_143c0997961ae70b.webp"
width="760"
height="426"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
File 5 viewed using 3DCityDB web-map-client. It is possible to set it up to display a set of selected CityGML properties.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;We published a new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Lim J, Janssen P, Biljecki F (2020): Visualising detailed CityGML and ADE at the building scale. &lt;em&gt;ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences.&lt;/em&gt; XLIV-4/W1-2020: 83-90. &lt;a href="https://doi.org/10.5194/isprs-archives-xliv-4-w1-2020-83-2020" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-archives-xliv-4-w1-2020-83-2020&lt;/a&gt; &lt;a href="https://ual.sg/publication/2020-3-dgeoinfo-ade-visualisation/2020-3-dgeoinfo-ade-visualisation.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The lead author is Joie Lim from NUS.&lt;/p&gt;
&lt;p&gt;There is an increasing activity in developing workflows and implementations to convert BIM data into CityGML. However, there are still not many platforms that are suitable to view and interact with the detailed information stored as a result of such conversions, especially if an Application Domain Extension (ADE) is involved to support additional information. We investigated the ease of use and features supported by visualisation software and tools with CityGML and ADE support, and propose an approach to develop a tool that combines useful features using a set of generic rules that can extract CityGML ADE attributes. The work, while generic, is geared towards detailed architectural datasets sourced from BIM. We implemented the approach in a web-based viewer supporting the visualisation of CityGML datasets enriched with ADE features.&lt;/p&gt;
&lt;p&gt;For more information please see the &lt;a href="https://ual.sg/publication/2020-3-dgeoinfo-ade-visualisation/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2020-3-dgeoinfo-ade-visualisation/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/page-one_hu_188a25acd49dab62.webp 400w,
/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/page-one_hu_6ae222c8962c1d75.webp 760w,
/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/page-one_hu_ea978f97f017b4c2.webp 1200w"
src="https://ual.sg/post/2020/09/11/new-paper-visualising-detailed-citygml-and-ade-at-the-building-scale/page-one_hu_188a25acd49dab62.webp"
width="692"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2020_3dgeoinfo_ade_visualisation&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Lim, J. and Janssen, P. and Biljecki, F.}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-archives-xliv-4-w1-2020-83-2020}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{83--90}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Visualising detailed CityGML and ADE at the building scale}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{XLIV-4/W1-2020}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2020}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>SDE signed MOU with Esri Singapore to expand research in geospatial analytics</title><link>https://ual.sg/post/2020/09/02/sde-signed-mou-with-esri-singapore-to-expand-research-in-geospatial-analytics/</link><pubDate>Wed, 02 Sep 2020 18:46:47 +0800</pubDate><guid>https://ual.sg/post/2020/09/02/sde-signed-mou-with-esri-singapore-to-expand-research-in-geospatial-analytics/</guid><description>&lt;p&gt;We are happy that our &lt;a href="https://www.sde.nus.edu.sg" target="_blank" rel="noopener"&gt;NUS School of Design and Environment&lt;/a&gt; signed a memorandum of understanding with &lt;a href="https://esrisingapore.com.sg" target="_blank" rel="noopener"&gt;Esri Singapore&lt;/a&gt; to strengthen collaboration.&lt;/p&gt;
&lt;p&gt;The press release by the &lt;a href="https://www.sde.nus.edu.sg/news/sde-signed-mou-with-esri-singapore-to-expand-research-in-geospatial-analytics/" target="_blank" rel="noopener"&gt;School&lt;/a&gt; follows:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The National University of Singapore School of Design and Environment (SDE) signed an MOU with Esri Singapore to strengthen SDE’s research in geospatial analytics.&lt;/p&gt;
&lt;p&gt;Since its establishment in 1969, SDE remains the only faculty in a Singapore university that provides a comprehensive multi-disciplinary offering of teaching and research in architecture and landscape architecture, urban planning and design, project and facilities management, building performance and sustainability, real estate finance and economics, and industrial design.&lt;/p&gt;
&lt;p&gt;Working together with Esri, the pioneer and global leader in Geographic Information System (GIS) technology, SDE aims to expand its research in geospatial analytics.&lt;/p&gt;
&lt;p&gt;Professor Lam Khee Poh, Dean of NUS SDE and Mr Thomas Pramotedham, Chief Executive Officer, Esri Singapore, signed the Memorandum of Understanding (MOU) on 1 September 2020.&lt;/p&gt;
&lt;p&gt;The MOU collaboration includes joint research and development projects in the area of geospatial data collection and integration, geospatial analytics in the built environment, 3D city modelling, digital twins, and dynamic and real-time spatial data at the building, district and urban scale; and application of geospatial technologies in urbanization, health, liveability, autonomous vehicles, urban planning, real estate, including collaborating on joint grant proposals to Singapore government agencies.&lt;/p&gt;
&lt;p&gt;“We are excited to work with Esri Singapore to expand SDE’s research in geospatial analytics. Keeping abreast with an ever-changing built environment, the collaboration will further strengthen SDE’s position as a leading global institution in shaping a resilient future. SDE will continue our collaborative outreach to contribute to a well and green and resilient future,” said Professor Lam.&lt;/p&gt;
&lt;p&gt;“Esri Singapore is currently working with a number of government and educational institutions as part of a larger effort to build a nation of spatial thinkers. As a long-time partner of NUS, we are proud to be part of this collaboration to drive geospatial enabled innovation in SDE’s multi-disciplinary initiatives. The adoption of a geospatial context enables SDE’s students with a holistic approach towards design and planning through a better understanding and insights on how environment influence design,” said Mr Pramotedham.&lt;/p&gt;
&lt;p&gt;“Through this partnership, we look forward to nurturing a new generation of professionals who embraces spatial thinking to design and build resilient communities,” said Mr Pramotedham.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/09/02/sde-signed-mou-with-esri-singapore-to-expand-research-in-geospatial-analytics/0-3_hu_f3db9487f9c5cff3.webp 400w,
/post/2020/09/02/sde-signed-mou-with-esri-singapore-to-expand-research-in-geospatial-analytics/0-3_hu_93fc471c417a3267.webp 760w,
/post/2020/09/02/sde-signed-mou-with-esri-singapore-to-expand-research-in-geospatial-analytics/0-3_hu_212843358e3ace84.webp 1200w"
src="https://ual.sg/post/2020/09/02/sde-signed-mou-with-esri-singapore-to-expand-research-in-geospatial-analytics/0-3_hu_f3db9487f9c5cff3.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>We are accepting PhD candidates</title><link>https://ual.sg/post/2020/08/25/we-are-accepting-phd-candidates/</link><pubDate>Tue, 25 Aug 2020 10:14:48 +0800</pubDate><guid>https://ual.sg/post/2020/08/25/we-are-accepting-phd-candidates/</guid><description>&lt;p&gt;We are continuously looking for talented and highly motivated prospective graduate students with a matching background to join our research group.
The PhD candidates will work on compelling topics at the forefront of GIScience, 3D city modelling, and urban analytics.&lt;/p&gt;
&lt;p&gt;In exceptional cases, we will offer co-funding to make the PhD journey more comfortable.&lt;/p&gt;
&lt;p&gt;Read &lt;a href="https://ual.sg/opportunities/phd/"&gt;our page for prospective PhD applicants&lt;/a&gt; for more information and how to apply.&lt;/p&gt;</description></item><item><title>New graduation projects completed at our Lab</title><link>https://ual.sg/post/2020/08/23/new-graduation-projects-completed-at-our-lab/</link><pubDate>Sun, 23 Aug 2020 11:15:02 +0800</pubDate><guid>https://ual.sg/post/2020/08/23/new-graduation-projects-completed-at-our-lab/</guid><description>&lt;p&gt;We are proud to announce that in the last month, four students have completed their studies by carrying out a graduation project with us.&lt;/p&gt;
&lt;h3 id="using-3d-city-models-to-uncover-urban-farming-potential-in-public-housing-blocks-of-singapore"&gt;Using 3D city models to uncover urban farming potential in public housing blocks of Singapore&lt;/h3&gt;
&lt;p&gt;In his thesis, &lt;a href="https://ual.sg/author/ankit-palliwal/"&gt;Ankit Palliwal&lt;/a&gt; has worked on establishing a new application of &lt;a href="https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/"&gt;3D city models&lt;/a&gt;: identification of locations in buildings that are suitable for urban farming.
This multidisciplinary work has been conducted in collaboration with the &lt;a href="http://www.dbs.nus.edu.sg" target="_blank" rel="noopener"&gt;NUS Department of Biological Sciences&lt;/a&gt;.
His thesis has been condensed into a paper &amp;ndash; its preprint is available on &lt;a href="https://arxiv.org/abs/2007.14203" target="_blank" rel="noopener"&gt;arXiv&lt;/a&gt;.
The work largely relies on OpenStreetMap and open-source tools, so it can be replicated elsewhere.&lt;/p&gt;
&lt;p&gt;Update (Jan 2021): this project has been &lt;a href="https://ual.sg/publication/2021-ceus-3-d-farming/"&gt;published as a journal paper in Computers, Environment and Urban Systems&lt;/a&gt;.&lt;/p&gt;
&lt;figure id="figure-solar-exposure-of-building-facades-of-a-public-housing-block-done-using-3d-building-models-in-the-study-area-these-results-are-a-critical-insight-for-decision-making-for-high-rise-urban-farming-and-for-maximizing-the-crop-yield-to-learn-more-read-the-preprinthttpsarxivorgabs200714203"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Solar exposure of building facades of a public housing block, done using 3D building models in the study area. These results are a critical insight for decision-making for high-rise urban farming and for maximizing the crop yield. To learn more, read the [preprint](https://arxiv.org/abs/2007.14203)." srcset="
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/facadesDLI_hu_1706d5518752f20c.webp 400w,
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/facadesDLI_hu_c0cc62d4a62ac6.webp 760w,
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/facadesDLI_hu_256aaf52e169923e.webp 1200w"
src="https://ual.sg/post/2020/08/23/new-graduation-projects-completed-at-our-lab/facadesDLI_hu_1706d5518752f20c.webp"
width="760"
height="424"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Solar exposure of building facades of a public housing block, done using 3D building models in the study area. These results are a critical insight for decision-making for high-rise urban farming and for maximizing the crop yield. To learn more, read the &lt;a href="https://arxiv.org/abs/2007.14203" target="_blank" rel="noopener"&gt;preprint&lt;/a&gt;.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="assessing-the-quality-of-openstreetmap-building-data-in-singapore"&gt;Assessing the quality of OpenStreetMap building data in Singapore&lt;/h3&gt;
&lt;p&gt;As the project above suggests, building data in OpenStreetMap can be very useful.
But how good is the quality of this dataset in Singapore?
Surprisingly, this topic hasn’t been investigated much until &lt;a href="https://ual.sg/author/ethan-chen-wai-hoong/"&gt;Ethan Chen Wai Hoong&lt;/a&gt; picked it as his graduation project.&lt;/p&gt;
&lt;p&gt;The key result is that virtually all public housing buildings in Singapore are mapped in OpenStreetMap.
The research looked into many aspects, e.g. the quality of the polygons and attributes, and the relation of the assessed quality to the demographics of a particular area to explain the variations.
To the best of our knowledge, this is the first study on the quality of OpenStreetMap data in Singapore.
To learn more about Ethan&amp;rsquo;s work, check out his &lt;a href="https://ual.sg/post/2020/08/22/assessing-the-quality-of-openstreetmap-building-data-in-singapore/"&gt;report&lt;/a&gt;.&lt;/p&gt;
&lt;figure id="figure-mean-offset-distance-of-osm-building-data-and-the-reference-dataset-by-cells"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Mean offset distance of OSM building data and the reference dataset by cells." srcset="
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/mean_distance_hu_b26d4793c9c92b5d.webp 400w,
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/mean_distance_hu_cbcd9d96c264e0c1.webp 760w,
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/mean_distance_hu_99874db715684ed6.webp 1200w"
src="https://ual.sg/post/2020/08/23/new-graduation-projects-completed-at-our-lab/mean_distance_hu_b26d4793c9c92b5d.webp"
width="760"
height="509"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Mean offset distance of OSM building data and the reference dataset by cells.
&lt;/figcaption&gt;&lt;/figure&gt;
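The mean offset metric visualised above can be sketched in a few lines: assuming each OSM building has already been matched to its reference counterpart, the per-cell statistic is simply the average Euclidean distance between matched centroids. The function and the sample coordinates below are hypothetical, for illustration only.

```python
import math

def mean_offset(pairs):
    """Mean Euclidean offset between matched OSM and reference centroids.

    pairs: list of ((osm_x, osm_y), (ref_x, ref_y)) tuples for one grid cell,
    in a projected coordinate system so distances are in metres.
    """
    dists = [math.hypot(ox - rx, oy - ry) for (ox, oy), (rx, ry) in pairs]
    return sum(dists) / len(dists)

# two matched buildings: one offset by 5 m, one perfectly aligned
cell = [((0.0, 0.0), (3.0, 4.0)), ((1.0, 1.0), (1.0, 1.0))]
mean_offset(cell)  # averages 5.0 and 0.0
```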
&lt;h3 id="enhanced-population-estimation-beyond-counts-exploring-age-patterns"&gt;Enhanced population estimation beyond counts: exploring age patterns&lt;/h3&gt;
&lt;p&gt;&lt;a href="https://ual.sg/author/noee-szarka/"&gt;Noée Szarka&lt;/a&gt; studies GIS at the University of Edinburgh.
This semester she has been a visiting scholar at the &lt;a href="https://ual.sg/"&gt;NUS Urban Analytics Lab&lt;/a&gt;, carrying out research on extending traditional population estimation methods by including the prediction of demographic data such as age.
Her research revealed that it is possible to enhance population predictions beyond traditional counts, thanks to drivers such as the density of particular amenities and the age of buildings.&lt;/p&gt;
&lt;figure id="figure-locations-of-student-care-facilities-in-singapore-one-of-the-indicators-of-age-which-was-used-in-noées-research"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Locations of student care facilities in Singapore, one of the indicators of age, which was used in Noée&amp;#39;s research." srcset="
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/student-care-facilities_hu_9b2a196602d51287.webp 400w,
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/student-care-facilities_hu_4d47cc13e223308e.webp 760w,
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/student-care-facilities_hu_97fddb22ec336491.webp 1200w"
src="https://ual.sg/post/2020/08/23/new-graduation-projects-completed-at-our-lab/student-care-facilities_hu_9b2a196602d51287.webp"
width="760"
height="502"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Locations of student care facilities in Singapore, one of the indicators of age, which was used in Noée&amp;rsquo;s research.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="height-inference-for-all-us-building-footprints-in-the-absence-of-height-data"&gt;Height inference for all US building footprints in the absence of height data&lt;/h3&gt;
&lt;p&gt;&lt;a href="https://nl.linkedin.com/in/imkelansky" target="_blank" rel="noopener"&gt;Imke Lánský&lt;/a&gt; graduated with an MSc in Geomatics at the Delft University of Technology.
We had a role in her graduation research, as it is closely related to our main project Large-scale 3D geospatial data for urban analytics.&lt;/p&gt;
&lt;p&gt;She has done a great job in developing a method to predict the heights of all buildings in the United States from various indicators such as the urban morphology.
The full text of her thesis is available at the &lt;a href="https://repository.tudelft.nl/islandora/object/uuid:ddcae7d1-6cc8-42a7-8c1d-a922ec7551f0?collection=education" target="_blank" rel="noopener"&gt;TU Delft repository&lt;/a&gt;, together with the well-documented &lt;a href="https://github.com/ImkeLansky/USA-BuildingHeightInference" target="_blank" rel="noopener"&gt;code&lt;/a&gt; (Imke, kudos for making your research reproducible &amp;#x1f44d;).&lt;/p&gt;
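As a rough illustration of the idea, and emphatically not Imke's actual method (her research uses several machine-learning models and many morphology predictors), inferring heights from a single indicator can be reduced to fitting a line by ordinary least squares:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept.

    A toy stand-in: xs could be a single morphology indicator (e.g.
    footprint area) and ys the known building heights in a training set.
    """
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

slope, intercept = fit_line([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])
# predict the height of an unseen building from its indicator value
predicted = slope * 3.0 + intercept
```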
&lt;figure id="figure-maps-showing-a-comparison-between-building-heights-in-the-reference-model-and-the-building-height-predictions-using-three-machine-learning-models-for-the-cbd-of-seattle-washington-imkes-thesis-is-full-of-nice-visuals-so-feel-free-to-check-it-outhttpsrepositorytudelftnlislandoraobjectuuidddcae7d1-6cc8-42a7-8c1d-a922ec7551f0collectioneducation-open-access"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Maps showing a comparison between building heights in the reference model and the building height predictions using three machine learning models for the CBD of Seattle, Washington. Imke&amp;#39;s thesis is full of nice visuals, so feel free to check [it out](https://repository.tudelft.nl/islandora/object/uuid:ddcae7d1-6cc8-42a7-8c1d-a922ec7551f0?collection=education) (open access)." srcset="
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/height-predictions_hu_16be5d5f2d919884.webp 400w,
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/height-predictions_hu_e7f355d6b88abc36.webp 760w,
/post/2020/08/23/new-graduation-projects-completed-at-our-lab/height-predictions_hu_7c19e5313095cfa5.webp 1200w"
src="https://ual.sg/post/2020/08/23/new-graduation-projects-completed-at-our-lab/height-predictions_hu_16be5d5f2d919884.webp"
width="760"
height="362"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Maps showing a comparison between building heights in the reference model and the building height predictions using three machine learning models for the CBD of Seattle, Washington. Imke&amp;rsquo;s thesis is full of nice visuals, so feel free to check &lt;a href="https://repository.tudelft.nl/islandora/object/uuid:ddcae7d1-6cc8-42a7-8c1d-a922ec7551f0?collection=education" target="_blank" rel="noopener"&gt;it out&lt;/a&gt; (open access).
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Congratulations to everyone on the awesome job and your degrees &amp;#x1f393; &amp;#x1f44f;.
We wish you all the best in your future career steps.&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id="looking-for-a-thesis--capstone-project-topic"&gt;Looking for a thesis / capstone project topic?&lt;/h3&gt;
&lt;p&gt;Are you an NUS student who would like to carry out cool research as your graduation project?
As was the case last year, in the new academic year we will be accepting a few motivated students to carry out a research project with us.
Feel free to check the &lt;a href="https://ual.sg/opportunities/student-projects/#theses-dissertations-and-capstone-projects"&gt;topics we offer&lt;/a&gt;, or you can propose your own idea.
We are also running a large project in which we have prospects for graduation research / capstone projects.&lt;/p&gt;
&lt;h3 id="looking-rather-for-a-student-researcher-job"&gt;Looking rather for a student researcher job?&lt;/h3&gt;
&lt;p&gt;We have announced an &lt;a href="https://ual.sg/opportunities/ads/2020-student-researcher-gis/"&gt;opening&lt;/a&gt; for a student researcher to assist us in our projects.&lt;/p&gt;</description></item><item><title>Assessing the quality of OpenStreetMap building data in Singapore</title><link>https://ual.sg/post/2020/08/22/assessing-the-quality-of-openstreetmap-building-data-in-singapore/</link><pubDate>Sat, 22 Aug 2020 10:15:02 +0800</pubDate><guid>https://ual.sg/post/2020/08/22/assessing-the-quality-of-openstreetmap-building-data-in-singapore/</guid><description>&lt;p&gt;A new &lt;a href="https://ual.sg/publication/2020-osm-sg-building-quality/"&gt;report&lt;/a&gt; on OpenStreetMap (OSM) data quality assessment in Singapore has been published.
The report describes a recent research project carried out at the Lab by &lt;a href="https://ual.sg/author/ethan-chen-wai-hoong/"&gt;Ethan Chen Wai Hoong&lt;/a&gt;, focusing on residential buildings.
Its summary follows.&lt;/p&gt;
&lt;p&gt;As OpenStreetMap is getting increasingly popular due to its open-license nature and collaborative aspect, its data quality is increasingly under scrutiny from many geospatial enthusiasts and scientists.
Given that many web services and scientific researchers are relying on OpenStreetMap data as the primary data source, data inaccuracy would cause unforeseen problems.
Therefore, it is imperative to assess the quality of OpenStreetMap data in order to identify areas for improvement and to strengthen its reliability.
While the assessment of OpenStreetMap data quality is an ongoing task in many countries, there is a lack of such assessments in Singapore.
Therefore, this study was conducted to address this research gap.&lt;/p&gt;
&lt;p&gt;Five quality metrics of Housing &amp;amp; Development Board (HDB) buildings were studied and analysed as part of the assessment of OpenStreetMap building data quality in Singapore: completeness, positional accuracy, shape accuracy, orientation accuracy, and attribute accuracy.
The results of this study suggest that the completeness of HDB building data in Singapore is close to perfect, with 97.67% of the HDB blocks being mapped in OpenStreetMap.&lt;/p&gt;
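A completeness figure of this kind boils down to the share of reference HDB blocks that have a matching footprint in OpenStreetMap. A minimal sketch (the function name and the counts below are made up for illustration, not taken from the study):

```python
def completeness_pct(matched, total):
    """Completeness as the percentage of reference buildings that have
    a matched counterpart in OSM, rounded to two decimals."""
    return round(100.0 * matched / total, 2)

# illustrative counts only: 9767 of 10000 reference blocks matched
completeness_pct(9767, 10000)
```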
&lt;p&gt;Taking all quality metrics into account, it was concluded from this study that the overall quality of HDB buildings in Singapore is fairly good, with some room for improvement.
With regard to improving the overall quality of OpenStreetMap data, this study recommends that the OpenStreetMap community explores building a data quality warning system for its users.
In addition, correlation analyses revealed that both the median age of planning areas and the mean age of HDB buildings have weak relationships with the data quality of HDB buildings in Singapore.
Furthermore, this study has also found that it is currently not feasible to use attributes of HDB buildings in OpenStreetMap to build &lt;a href="https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/"&gt;semantically rich 3D building models&lt;/a&gt;, as these attributes are mostly unfilled.&lt;/p&gt;
&lt;p&gt;The full report is available &lt;a href="https://ual.sg/publication/2020-osm-sg-building-quality/"&gt;here&lt;/a&gt;.
This research has been conducted by &lt;a href="https://ual.sg/author/ethan-chen-wai-hoong/"&gt;Ethan Chen Wai Hoong&lt;/a&gt; as part of the GE6226 GIS Research Project module of MSc in Applied GIS programme in NUS.&lt;/p&gt;</description></item><item><title>Filip Biljecki appointed as Presidential Young Professor</title><link>https://ual.sg/post/2020/07/16/filip-biljecki-appointed-as-presidential-young-professor/</link><pubDate>Thu, 16 Jul 2020 06:49:03 +0800</pubDate><guid>https://ual.sg/post/2020/07/16/filip-biljecki-appointed-as-presidential-young-professor/</guid><description>&lt;p&gt;We are pleased to announce that Dr &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, the founder and the principal investigator of the &lt;a href="https://ual.sg/"&gt;Lab&lt;/a&gt;, has been awarded the prestigious NUS Presidential Young Professorship (PYP).&lt;/p&gt;
&lt;p&gt;The PYP is conferred on exceptional early-career faculty with an excellent research track record.
This award strengthens the university&amp;rsquo;s support for the development of the &lt;a href="https://ual.sg/"&gt;Urban Analytics Lab&lt;/a&gt; and solidifies our research lines on 3D geospatial and urban data infrastructure and analysis.&lt;/p&gt;</description></item><item><title>Our participation at the State of the Map 2020</title><link>https://ual.sg/post/2020/07/05/our-participation-at-the-state-of-the-map-2020/</link><pubDate>Sun, 05 Jul 2020 23:18:49 +0800</pubDate><guid>https://ual.sg/post/2020/07/05/our-participation-at-the-state-of-the-map-2020/</guid><description>&lt;p&gt;&lt;a href="https://stateofthemap.org" target="_blank" rel="noopener"&gt;State of the Map (SOTM)&lt;/a&gt; is the yearly summit of the &lt;a href="https://www.openstreetmap.org" target="_blank" rel="noopener"&gt;OpenStreetMap (OSM)&lt;/a&gt; community.
&lt;a href="https://2020.stateofthemap.org" target="_blank" rel="noopener"&gt;This year&amp;rsquo;s event&lt;/a&gt; moved online, together with an academic track.&lt;/p&gt;
&lt;p&gt;It has been a great pleasure that our Lab participated in it, with a talk by &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt;, co-authored by &lt;a href="https://ual.sg/author/li-min-ang/"&gt;Li Min Ang&lt;/a&gt;, presenting ongoing work which is part of our main project Large-scale 3D Geospatial Data for Urban Analytics.&lt;/p&gt;
&lt;p&gt;The papers of the academic track are published open access at &lt;a href="https://zenodo.org/communities/sotm-2020/" target="_blank" rel="noopener"&gt;Zenodo&lt;/a&gt;, including our &lt;a href="https://ual.sg/publication/2020-sotm-3-d/"&gt;contribution&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Besides the &lt;a href="https://2020.stateofthemap.org/programme/" target="_blank" rel="noopener"&gt;talks&lt;/a&gt;, consider checking out the &lt;a href="https://2020.stateofthemap.org/posters/" target="_blank" rel="noopener"&gt;posters&lt;/a&gt; as well.&lt;/p&gt;
&lt;p&gt;Thanks to the organisers for this wonderful event, and to the &lt;a href="https://2020.stateofthemap.org/#sponsors" target="_blank" rel="noopener"&gt;sponsors&lt;/a&gt; for supporting it!&lt;/p&gt;</description></item><item><title>A Comparison of Spatial Functions: PostGIS, Athena, PrestoDB, BigQuery vs RedShift</title><link>https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/</link><pubDate>Fri, 03 Jul 2020 16:50:51 +0800</pubDate><guid>https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/</guid><description>&lt;p&gt;We are moving our geospatial analysis to the cloud. As we are evaluating the various offerings by
&lt;a href="https://aws.amazon.com/" target="_blank" rel="noopener"&gt;AWS&lt;/a&gt; and &lt;a href="https://cloud.google.com/" target="_blank" rel="noopener"&gt;Google Cloud Platform&lt;/a&gt;, we wonder to what extent
these cloud database products support spatial functions in comparison to &lt;a href="https://postgis.net" target="_blank" rel="noopener"&gt;PostGIS&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;PostGIS, the most widely used spatial extender for the PostgreSQL object-relational database system, allows spatial objects to be stored in the database and offers a wide range of functions for their analysis and processing.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://aws.amazon.com/athena/" target="_blank" rel="noopener"&gt;AWS Athena&lt;/a&gt;, &lt;a href="https://prestodb.io/" target="_blank" rel="noopener"&gt;PrestoDB&lt;/a&gt;,
&lt;a href="https://console.cloud.google.com/bigquery" target="_blank" rel="noopener"&gt;Google BigQuery&lt;/a&gt;, and &lt;a href="https://aws.amazon.com/redshift/" target="_blank" rel="noopener"&gt;AWS Redshift&lt;/a&gt;
are included in our considerations. Direct links to the respective documentation of currently supported
spatial functions are listed in the &lt;a href="#references"&gt;References&lt;/a&gt; section at the end of this post.&lt;/p&gt;
&lt;p&gt;Here is a summary of the comparison.&lt;/p&gt;
&lt;h3 id="geometrygeographybox-data-types"&gt;Geometry/Geography/Box Data Types&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp1-PostGIS-Data-Types_hu_3955054219fd855a.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp1-PostGIS-Data-Types_hu_d1a26de0cc6720ab.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp1-PostGIS-Data-Types_hu_b91be2e2fcce826e.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp1-PostGIS-Data-Types_hu_3955054219fd855a.webp"
width="760"
height="88"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="table-management-functions"&gt;Table Management Functions&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp2-Table-Management_hu_b6a68e6f8f55e234.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp2-Table-Management_hu_d6ebcb6feb2aea3b.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp2-Table-Management_hu_792d6d8efb8afccd.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp2-Table-Management_hu_b6a68e6f8f55e234.webp"
width="760"
height="103"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="geometry-constructors"&gt;Geometry Constructors&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp3-Constructors_hu_43fc3a3f7569ab27.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp3-Constructors_hu_60b7c931aab31d2.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp3-Constructors_hu_4bc7ceed20e3ea28.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp3-Constructors_hu_43fc3a3f7569ab27.webp"
width="760"
height="220"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="geometry-accessors"&gt;Geometry Accessors&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp4-Accessors1_hu_1b2df276f90be61f.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp4-Accessors1_hu_c6bee59fc0050a8d.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp4-Accessors1_hu_e715a4a2adf8cb21.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp4-Accessors1_hu_1b2df276f90be61f.webp"
width="760"
height="409"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp4-Accessors2_hu_5dc42bfe86b6cda5.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp4-Accessors2_hu_f7368b8734241e05.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp4-Accessors2_hu_c54817272b81fa9e.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp4-Accessors2_hu_5dc42bfe86b6cda5.webp"
width="760"
height="292"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="geometry-editors"&gt;Geometry Editors&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp5-Editors_hu_2b9aead034e42ec3.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp5-Editors_hu_d85607e3a23f44c5.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp5-Editors_hu_82d08337ad047654.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp5-Editors_hu_2b9aead034e42ec3.webp"
width="760"
height="380"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="geometry-validation"&gt;Geometry Validation&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp6-Validation_hu_c1a5898bfc738186.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp6-Validation_hu_f1a2e08b4cbb1ca7.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp6-Validation_hu_40c2723244950905.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp6-Validation_hu_c1a5898bfc738186.webp"
width="760"
height="58"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="spatial-reference-system-functions"&gt;Spatial Reference System Functions&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp7-Spatial-Reference-System_hu_4e67a014c960722e.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp7-Spatial-Reference-System_hu_9ef391dcfa164703.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp7-Spatial-Reference-System_hu_a0bd8980063564cb.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp7-Spatial-Reference-System_hu_4e67a014c960722e.webp"
width="760"
height="58"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="geometry-input"&gt;Geometry Input&lt;/h3&gt;
&lt;h4 id="well-known-text-wkt"&gt;&lt;em&gt;Well-Known Text (WKT)&lt;/em&gt;&lt;/h4&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8a-Input-WKT_hu_7bbc6af8b5e9ca71.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8a-Input-WKT_hu_39b904ec4faefb3b.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8a-Input-WKT_hu_c1222046082e80f7.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8a-Input-WKT_hu_7bbc6af8b5e9ca71.webp"
width="760"
height="234"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
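To give a feel for what the WKT input functions compared above (such as ST_GeomFromText) do, here is a minimal, hypothetical parser for the simplest case, a WKT point; real implementations of course handle all geometry types, SRIDs, and validation.

```python
import re

def point_from_wkt(wkt):
    """Parse a WKT point like 'POINT (103.85 1.29)' into an (x, y) tuple.

    Illustrative only: a tiny subset of what ST_GeomFromText accepts.
    """
    m = re.match(r"POINT\s*\(\s*(\S+)\s+(\S+)\s*\)", wkt.strip(), re.IGNORECASE)
    if m is None:
        raise ValueError("not a WKT POINT")
    return float(m.group(1)), float(m.group(2))

point_from_wkt("POINT (103.85 1.29)")  # longitude/latitude of Singapore
```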
&lt;h4 id="well-known-binary-wkb"&gt;&lt;em&gt;Well-Known Binary (WKB)&lt;/em&gt;&lt;/h4&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8b-Input-WKB_hu_dbd225757d871105.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8b-Input-WKB_hu_95ec25327c0e4f03.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8b-Input-WKB_hu_8f6b0e6a23b48ff4.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8b-Input-WKB_hu_dbd225757d871105.webp"
width="760"
height="117"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h4 id="other-formats"&gt;&lt;em&gt;Other Formats&lt;/em&gt;&lt;/h4&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8c-Input-Others_hu_df0ced2767ca1056.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8c-Input-Others_hu_9dbdbb5bfa34127a.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8c-Input-Others_hu_ce93bf358c5832ad.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp8c-Input-Others_hu_df0ced2767ca1056.webp"
width="760"
height="146"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="geometry-output"&gt;Geometry Output&lt;/h3&gt;
&lt;h4 id="well-known-text-wkt-1"&gt;&lt;em&gt;Well-Known Text (WKT)&lt;/em&gt;&lt;/h4&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9a-Output-WKT_hu_7df91a244ab46121.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9a-Output-WKT_hu_3b8e23e10cd4bc11.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9a-Output-WKT_hu_db51da4d0ca7438d.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9a-Output-WKT_hu_7df91a244ab46121.webp"
width="760"
height="59"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h4 id="well-known-binary-wkb-1"&gt;&lt;em&gt;Well-Known Binary (WKB)&lt;/em&gt;&lt;/h4&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9b-Output-WKB_hu_aaf0e98a505b9690.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9b-Output-WKB_hu_d36c5ae96d4af8fc.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9b-Output-WKB_hu_bb8c1cd7019a51f.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9b-Output-WKB_hu_aaf0e98a505b9690.webp"
width="760"
height="44"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h4 id="other-formats-1"&gt;&lt;em&gt;Other Formats&lt;/em&gt;&lt;/h4&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9c-Output-Others_hu_3a0d756f80a1dd55.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9c-Output-Others_hu_1b7c1bd648462d0.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9c-Output-Others_hu_681e3eed901f7efe.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp9c-Output-Others_hu_3a0d756f80a1dd55.webp"
width="760"
height="189"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="operators"&gt;Operators&lt;/h3&gt;
&lt;h4 id="bounding-box-operators"&gt;&lt;em&gt;Bounding Box Operators&lt;/em&gt;&lt;/h4&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp10a-Operators-Box_hu_af3643f909fe6a20.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp10a-Operators-Box_hu_2b6a879a40f2c31d.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp10a-Operators-Box_hu_f4ea2381bf89bce4.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp10a-Operators-Box_hu_af3643f909fe6a20.webp"
width="760"
height="395"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h4 id="distance-operators"&gt;&lt;em&gt;Distance Operators&lt;/em&gt;&lt;/h4&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp10b-Operators-Distance_hu_af2aa50a30440614.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp10b-Operators-Distance_hu_8047c6536dbf487a.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp10b-Operators-Distance_hu_43dd1444c0453087.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp10b-Operators-Distance_hu_af2aa50a30440614.webp"
width="760"
height="88"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="spatial-relationships"&gt;Spatial Relationships&lt;/h3&gt;
&lt;h4 id="topological-relationships"&gt;&lt;em&gt;Topological Relationships&lt;/em&gt;&lt;/h4&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp11a-Relationships-Topological_hu_8b65ffeec647159b.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp11a-Relationships-Topological_hu_9654a4e7c57cd1b7.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp11a-Relationships-Topological_hu_86a37128c6e3683d.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp11a-Relationships-Topological_hu_8b65ffeec647159b.webp"
width="760"
height="277"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h4 id="distance-relationship"&gt;&lt;em&gt;Distance Relationship&lt;/em&gt;&lt;/h4&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp11b-Relationships-Distance_hu_f7485e506152f6b4.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp11b-Relationships-Distance_hu_95f66537ae5eac79.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp11b-Relationships-Distance_hu_e3bed3cc4537cc54.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp11b-Relationships-Distance_hu_f7485e506152f6b4.webp"
width="760"
height="74"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
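For reference, distance predicates in this family (ST_Distance, ST_DWithin, and kin) come down to computations like the following stdlib-only sketch for planar points (function names are illustrative; geodetic distances on a spheroid are more involved):

```python
import math

def distance(p, q):
    # Planar Euclidean distance, as ST_Distance computes for projected points.
    return math.hypot(q[0] - p[0], q[1] - p[1])

def dwithin(p, q, d):
    # ST_DWithin-style predicate: true when the points lie within distance d.
    return distance(p, q) <= d

print(distance((0, 0), (3, 4)))      # 5.0
print(dwithin((0, 0), (3, 4), 5.0))  # True
```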
&lt;h3 id="measurement-functions"&gt;Measurement Functions&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp12-Measurement_hu_e91be9b0823dbe04.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp12-Measurement_hu_cf1c80a1c7783d3d.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp12-Measurement_hu_9b231249bcc14ed4.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp12-Measurement_hu_e91be9b0823dbe04.webp"
width="760"
height="409"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
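As a concrete anchor for this category: on planar geometries, measurement functions such as ST_Area and ST_Perimeter compute quantities like the shoelace area and ring length sketched below (stdlib-only and illustrative; the geodetic variants these systems also offer are considerably more complex):

```python
import math

def polygon_area(ring):
    # Shoelace formula: the planar computation behind ST_Area.
    s = 0.0
    for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def ring_length(ring):
    # Perimeter of a closed ring, as ST_Perimeter would report.
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1]))

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(polygon_area(square))  # 16.0
print(ring_length(square))   # 16.0
```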
&lt;h3 id="geometry-processing"&gt;Geometry Processing&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp13-Processing1_hu_a5ac898713538557.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp13-Processing1_hu_1d4af1cea021be12.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp13-Processing1_hu_e8555d87e1425964.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp13-Processing1_hu_a5ac898713538557.webp"
width="760"
height="454"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp13-Processing2_hu_1a321555f977f2d0.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp13-Processing2_hu_3b53e044ec16e8fe.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp13-Processing2_hu_82c7ba7b59c276f0.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp13-Processing2_hu_1a321555f977f2d0.webp"
width="760"
height="176"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
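To make this category concrete: a processing function like ST_ConvexHull can be implemented for point sets with Andrew&amp;rsquo;s monotone chain algorithm, sketched here in stdlib-only Python (illustrative only; the engines compared above handle arbitrary geometry types, not just points):

```python
def convex_hull(points):
    # Andrew's monotone chain: builds lower and upper hulls over sorted points.
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # Cross product sign: > 0 means a counter-clockwise turn o -> a -> b.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for seq, out in ((pts, lower), (reversed(pts), upper)):
        for p in seq:
            while len(out) >= 2 and cross(out[-2], out[-1], p) <= 0:
                out.pop()
            out.append(p)
    # Drop the last point of each half (it repeats the other half's start).
    return lower[:-1] + upper[:-1]

pts = [(0, 0), (2, 1), (4, 0), (4, 4), (0, 4), (2, 2)]
print(convex_hull(pts))  # [(0, 0), (4, 0), (4, 4), (0, 4)]
```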
&lt;h3 id="affine-transformations"&gt;Affine Transformations&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp14-Affine-Transformations_hu_ceba027dd1f6819c.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp14-Affine-Transformations_hu_a9429ce50a360907.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp14-Affine-Transformations_hu_2688d653f4aff75e.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp14-Affine-Transformations_hu_ceba027dd1f6819c.webp"
width="760"
height="132"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
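For orientation, an ST_Affine-style transform applies x&amp;rsquo; = a·x + b·y + xoff, y&amp;rsquo; = d·x + e·y + yoff to every coordinate, with rotation, scaling, and translation as special cases. A stdlib-only sketch (function names are ours):

```python
import math

def affine(points, a, b, d, e, xoff, yoff):
    # General 2D affine transform, as in ST_Affine's 2D form.
    return [(a * x + b * y + xoff, d * x + e * y + yoff) for x, y in points]

def rotate(points, angle):
    # Rotation about the origin: a special case of the affine transform.
    c, s = math.cos(angle), math.sin(angle)
    return affine(points, c, -s, s, c, 0.0, 0.0)

print(affine([(1, 2)], 2, 0, 0, 2, 10, 0))  # [(12, 4)]  (scale by 2, shift x by 10)
pt, = rotate([(1, 0)], math.pi / 2)
print(round(pt[0], 9), round(pt[1], 9))     # 0.0 1.0
```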
&lt;h3 id="clustering-functions"&gt;Clustering Functions&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp15-Clustering_hu_64037760c9281f02.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp15-Clustering_hu_bc989c5cca7a2430.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp15-Clustering_hu_2af5ba816b35b754.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp15-Clustering_hu_64037760c9281f02.webp"
width="760"
height="73"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
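To illustrate what this family does: an ST_ClusterWithin-style function groups geometries whose chains of pairwise distances stay within a threshold, which for points can be sketched with a small union-find (stdlib-only, quadratic, and illustrative; real implementations use spatial indexes):

```python
import math

def cluster_within(points, d):
    # Union-find over all point pairs closer than d, then group by root.
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= d:
                parent[find(i)] = find(j)

    clusters = {}
    for i, p in enumerate(points):
        clusters.setdefault(find(i), []).append(p)
    return list(clusters.values())

pts = [(0, 0), (0.5, 0), (10, 10), (10.4, 10.2)]
print(cluster_within(pts, 1.0))  # [[(0, 0), (0.5, 0)], [(10, 10), (10.4, 10.2)]]
```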
&lt;h3 id="bounding-box-functions"&gt;Bounding Box Functions&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp16-Bounding-Box_hu_738a0ab29a9548c3.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp16-Bounding-Box_hu_d7ac692ac7c784f.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp16-Bounding-Box_hu_de578f0dda18b35a.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp16-Bounding-Box_hu_738a0ab29a9548c3.webp"
width="760"
height="220"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
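Bounding boxes are simple but central: index-backed operators like PostGIS&amp;rsquo;s &amp;amp;&amp;amp; compare only the boxes, which makes them cheap pre-filters before exact predicates. A stdlib-only sketch (function names are illustrative):

```python
def bbox(points):
    # ST_Extent-style bounding box: (xmin, ymin, xmax, ymax).
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def bbox_intersects(a, b):
    # The cheap overlap test behind bounding-box operators: boxes overlap
    # iff they overlap on both axes.
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

print(bbox([(1, 5), (4, 2), (3, 7)]))               # (1, 2, 4, 7)
print(bbox_intersects((0, 0, 2, 2), (1, 1, 3, 3)))  # True
```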
&lt;h3 id="linear-referencing"&gt;Linear Referencing&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp17-Linear-Referencing_hu_39aac04146dff415.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp17-Linear-Referencing_hu_e3bb45c771b16587.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp17-Linear-Referencing_hu_aca5594aa618581e.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp17-Linear-Referencing_hu_39aac04146dff415.webp"
width="760"
height="162"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
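For context, linear referencing locates positions along a line by fraction of its length, as ST_LineInterpolatePoint does. A stdlib-only sketch of that interpolation (illustrative only):

```python
import math

def line_interpolate_point(line, fraction):
    # ST_LineInterpolatePoint-style: the point at `fraction` of the line's length.
    seg_lens = [math.hypot(x2 - x1, y2 - y1)
                for (x1, y1), (x2, y2) in zip(line, line[1:])]
    target = fraction * sum(seg_lens)
    for (x1, y1), (x2, y2), length in zip(line, line[1:], seg_lens):
        if target <= length and length > 0:
            t = target / length  # position within this segment
            return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
        target -= length
    return line[-1]

route = [(0, 0), (10, 0), (10, 10)]
print(line_interpolate_point(route, 0.75))  # (10.0, 5.0)
```

The inverse operation (ST_LineLocatePoint, returning the fraction closest to a given point) follows the same segment-walking idea.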
&lt;h3 id="trajectory-functions"&gt;Trajectory Functions&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp18-Trajectory_hu_b054c659f320643b.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp18-Trajectory_hu_885c24b934286993.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp18-Trajectory_hu_42851cd5b81b347f.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp18-Trajectory_hu_b054c659f320643b.webp"
width="760"
height="73"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="sfcgal-functions"&gt;SFCGAL Functions&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp19-SFCGAL_hu_3d0ff173bace074e.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp19-SFCGAL_hu_a42e4ec8a0743c54.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp19-SFCGAL_hu_a5f9733a7941d98a.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp19-SFCGAL_hu_3d0ff173bace074e.webp"
width="760"
height="263"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="long-transaction-support"&gt;Long Transaction Support&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp20-Long-Transaction_hu_5bddc69119168e21.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp20-Long-Transaction_hu_41ac523137da5070.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp20-Long-Transaction_hu_19865ddc2fd6d38f.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp20-Long-Transaction_hu_5bddc69119168e21.webp"
width="760"
height="88"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="aggregation"&gt;Aggregation&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp21-Aggregation_hu_1fb3f0ecaf77f344.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp21-Aggregation_hu_d991c41438f0b0e5.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp21-Aggregation_hu_a446cd86a5ace70b.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp21-Aggregation_hu_1fb3f0ecaf77f344.webp"
width="760"
height="59"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;h3 id="ms-big-tiles"&gt;MS Big Tiles&lt;/h3&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp22-MS-Bing-Tiles_hu_2992304eb9b85000.webp 400w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp22-MS-Bing-Tiles_hu_dcd647f9372cbfcb.webp 760w,
/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp22-MS-Bing-Tiles_hu_934c77e799a7ae27.webp 1200w"
src="https://ual.sg/post/2020/07/03/a-comparison-of-spatial-functions-postgis-athena-prestodb-bigquery-vs-redshift/comp22-MS-Bing-Tiles_hu_2992304eb9b85000.webp"
width="760"
height="132"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
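This category refers to the Bing Maps Tile System, where each tile is addressed by a quadkey that interleaves the bits of its Web Mercator tile X/Y coordinates. A stdlib-only sketch of the standard conversion from latitude/longitude to a quadkey (the function name is ours):

```python
import math

def latlon_to_quadkey(lat, lon, zoom):
    # Bing Maps tile system: project to Web Mercator tile XY at this zoom,
    # then interleave the X and Y bits into a base-4 quadkey string.
    lat = max(min(lat, 85.05112878), -85.05112878)  # Mercator latitude clamp
    n = 2 ** zoom
    tx = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    ty = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad))
              / math.pi) / 2.0 * n)
    tx, ty = min(max(tx, 0), n - 1), min(max(ty, 0), n - 1)
    digits = []
    for z in range(zoom, 0, -1):
        mask = 1 << (z - 1)
        digits.append(str((1 if tx & mask else 0) + (2 if ty & mask else 0)))
    return "".join(digits)

print(latlon_to_quadkey(1.35, 103.82, 3))  # 132  (a coarse tile over Singapore)
```

Each extra zoom level appends one digit, so a quadkey prefix identifies the parent tile.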
&lt;p&gt;Although AWS and Google Cloud Platform have started to support spatial functions in their product offerings,
many spatial functions are still missing at the time this post is written. Hopefully more of the currently
missing functions will become available soon.
&lt;br /&gt;
&lt;br /&gt;&lt;/p&gt;
&lt;hr&gt;
&lt;br /&gt;
&lt;a name="references"&gt;References&lt;/a&gt;:
- [PostGIS Reference](https://postgis.net/docs/reference.html)
- [AWS Athena Spatial Functions](https://docs.aws.amazon.com/athena/latest/ug/geospatial-functions-list.html)
- [Presto DB Geospatial Functions](https://prestodb.io/docs/current/functions/geospatial.html)
- [Google BigQuery Geography Functions](https://cloud.google.com/bigquery/docs/reference/standard-sql/geography_functions)
- [AWS Redshift Spatial Functions](https://docs.aws.amazon.com/redshift/latest/dg/geospatial-functions.html)
&lt;br /&gt;
&lt;br /&gt;</description></item><item><title>Welcome, Abraham</title><link>https://ual.sg/post/2020/07/02/welcome-abraham/</link><pubDate>Thu, 02 Jul 2020 09:39:03 +0800</pubDate><guid>https://ual.sg/post/2020/07/02/welcome-abraham/</guid><description>&lt;p&gt;We are pleased to welcome &lt;a href="https://ual.sg/author/abraham-noah-wu/"&gt;Abraham Noah Wu&lt;/a&gt; as a new full-time researcher in the group, working on our flagship project on generating large-scale 3D city models.&lt;/p&gt;
&lt;p&gt;Abraham has just graduated from NUS with a Master of Architecture.
During his studies, he spent a semester at ETH Zurich, and worked on several projects.&lt;/p&gt;
&lt;p&gt;Welcome, Abraham!&lt;/p&gt;</description></item><item><title>Guide for open urban data in Singapore</title><link>https://ual.sg/post/2020/06/24/guide-for-open-urban-data-in-singapore/</link><pubDate>Wed, 24 Jun 2020 08:04:48 +0800</pubDate><guid>https://ual.sg/post/2020/06/24/guide-for-open-urban-data-in-singapore/</guid><description>&lt;div class="alert alert-note"&gt;
&lt;div&gt;
TL;DR: In the spirit of academia and open science, we’re making our notes on open data in Singapore public, and intend to keep them updated.
Feel free to visit in future to check for updates as the list grows.
&lt;/div&gt;
&lt;/div&gt;
&lt;h2 id="introduction"&gt;Introduction&lt;/h2&gt;
&lt;p&gt;In our research and teaching activities that are focused on Singapore, we rely almost entirely on open data, enabling reproducibility and fostering open science.
We created a guide for open urban datasets to help navigate through all the resources.&lt;/p&gt;
&lt;p&gt;While &lt;a href="https://data.gov.sg" target="_blank" rel="noopener"&gt;Data.gov.sg&lt;/a&gt; (the open data portal of the Singapore Government) is thorough and is the starting and ending point for obtaining many useful datasets, it might take time to get an overview, and the availability of open data goes beyond it.
Furthermore, there are some particularities that may not be evident at first and which we elaborate on in the text (e.g. some datasets are available at multiple locations with slight differences).&lt;/p&gt;
&lt;p&gt;This index may be useful to novices to get an overview of what&amp;rsquo;s available in Singapore, but also to seasoned urban scientists who may learn about datasets they might not have been aware of.&lt;/p&gt;
&lt;p&gt;The data sources can be grouped into the following categories.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://data.gov.sg" target="_blank" rel="noopener"&gt;Data.gov.sg&lt;/a&gt; &amp;ndash; the Government&amp;rsquo;s Open Data portal, containing almost 2000 datasets on myriads of topics from dozens of public organisations. Many datasets are regularly updated. There are some GIS datasets too, and also APIs providing real-time data.&lt;/li&gt;
&lt;li&gt;Government resources that are outside the realm of Data.gov.sg, e.g. there may be additional datasets not deposited in the central government repository, some that are slightly different, or those with newer updates. For example, &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall.html" target="_blank" rel="noopener"&gt;LTA&amp;rsquo;s DataMall&lt;/a&gt; and &lt;a href="https://www.singstat.gov.sg/find-data/search-by-a-z" target="_blank" rel="noopener"&gt;SingStat&lt;/a&gt; have some additional resources, or datasets that are available on &lt;a href="https://data.gov.sg" target="_blank" rel="noopener"&gt;Data.gov.sg&lt;/a&gt; but they are arranged in various, potentially more appropriate forms (e.g. detailed time series instead of separate datasets). Such resources include several APIs as well.&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.openstreetmap.org/" target="_blank" rel="noopener"&gt;OpenStreetMap&lt;/a&gt; &amp;ndash; needless to mention for geospatial data, but surprisingly often overlooked.
OSM appears to have a very high level of quality in Singapore and rapid updates. Its data quality assessment was subject of recent research efforts conducted at our Lab (see &lt;a href="https://ual.sg/post/2020/08/22/assessing-the-quality-of-openstreetmap-building-data-in-singapore/"&gt;here&lt;/a&gt; and &lt;a href="https://ual.sg/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/"&gt;here&lt;/a&gt;).&lt;/li&gt;
&lt;li&gt;Data by research groups, companies, community, &amp;hellip;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This list is by no means a complete inventory of open datasets useful for urban analytics covering the city-state.
While there are other instances not mentioned here, these are the datasets we consider useful for our work, have used in our work, or we bookmarked them to consider using them in future.&lt;/p&gt;
&lt;h2 id="the-list"&gt;The List&lt;/h2&gt;
&lt;h3 id="building-and-housing-data"&gt;Building and housing data&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://data.gov.sg/dataset/hdb-property-information" target="_blank" rel="noopener"&gt;HDB Property Information&lt;/a&gt; contains data on each public housing block in Singapore (address, number of flats, year of completion, number of storeys, breakdown by flat type, &amp;hellip;).
It also includes non-residential blocks such as multi-storey carparks.
It does not contain building footprints though.
We used this dataset as one of the input datasets to generate &lt;a href="https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/"&gt;3D building models&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Geometric footprints of HDB buildings are available &lt;a href="https://data.gov.sg/collections/2033/view" target="_blank" rel="noopener"&gt;here&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Data on non-HDB buildings (landed houses, condos, commercial buildings&amp;hellip;) is not as complete and it is scattered around, but &lt;a href="https://www.ura.gov.sg/realEstateIIWeb/supply/search.action" target="_blank" rel="noopener"&gt;URA&amp;rsquo;s data portal&lt;/a&gt; is a good starting point for exploration.&lt;/li&gt;
&lt;li&gt;For open data on all building footprints, the best bet is OpenStreetMap: it has &lt;a href="https://ual.sg/post/2020/08/22/assessing-the-quality-of-openstreetmap-building-data-in-singapore/"&gt;nearly 100% completeness with rapid updates&lt;/a&gt;, but attribute data may be lacking.
Data.gov.sg contains &lt;a href="https://data.gov.sg/dataset/master-plan-2014-building" target="_blank" rel="noopener"&gt;a dataset representing building footprints&lt;/a&gt;, but for some reason it is incomplete, covering only a subset of buildings from several years ago.
It still might be useful though.&lt;/li&gt;
&lt;li&gt;Check out &lt;a href="https://ual.sg/project/roofpedia/"&gt;Roofpedia&lt;/a&gt;, our project that maps solar panels and green roofs on buildings, which includes open data on Singapore, together with several other cities.&lt;/li&gt;
&lt;li&gt;You may also be interested in our project &lt;a href="https://ual.sg/project/gbmi/"&gt;Global Building Morphology Indicators&lt;/a&gt;, which covers SG.&lt;/li&gt;
&lt;/ul&gt;
&lt;figure id="figure-photo-by-贝莉儿-danisthttpsunsplashcomdanist07-on-unsplashhttpsunsplashcomphotosfxealbv2dqu"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Photo by [贝莉儿 DANIST](https://unsplash.com/@danist07) on [Unsplash](https://unsplash.com/photos/FXeaLbv2DQU)." srcset="
/post/2020/06/24/guide-for-open-urban-data-in-singapore/danist-FXeaLbv2DQU-unsplash_hu_2b194f926049bd95.webp 400w,
/post/2020/06/24/guide-for-open-urban-data-in-singapore/danist-FXeaLbv2DQU-unsplash_hu_4aa3854753c1b831.webp 760w,
/post/2020/06/24/guide-for-open-urban-data-in-singapore/danist-FXeaLbv2DQU-unsplash_hu_9e6c323e218b03f9.webp 1200w"
src="https://ual.sg/post/2020/06/24/guide-for-open-urban-data-in-singapore/danist-FXeaLbv2DQU-unsplash_hu_2b194f926049bd95.webp"
width="760"
height="505"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Photo by &lt;a href="https://unsplash.com/@danist07" target="_blank" rel="noopener"&gt;贝莉儿 DANIST&lt;/a&gt; on &lt;a href="https://unsplash.com/photos/FXeaLbv2DQU" target="_blank" rel="noopener"&gt;Unsplash&lt;/a&gt;.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h3 id="3d-city-models"&gt;3D city models&lt;/h3&gt;
&lt;p&gt;Unfortunately, 3D city models are not released as open data, except the one &lt;a href="https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/"&gt;we generated covering only HDBs&lt;/a&gt;.
The recently &lt;a href="https://www.sla.gov.sg/articles/press-releases/2020/launch-of-onemap3d-beta-at-singapore-geospatial-week-2020" target="_blank" rel="noopener"&gt;released OneMap3D&lt;/a&gt; provides a &lt;a href="https://www.onemap3d.gov.sg/" target="_blank" rel="noopener"&gt;web viewer of the nation-wide 3D city model&lt;/a&gt;, but the data cannot be downloaded, thus, it does not qualify as &lt;a href="https://opendatahandbook.org/guide/en/what-is-open-data/" target="_blank" rel="noopener"&gt;open data&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Worth mentioning is that OpenStreetMap has &lt;a href="https://ual.sg/post/2020/09/12/new-paper-exploration-of-open-data-in-southeast-asia-to-generate-3d-building-models/"&gt;a relatively high level of completeness of building heights and floors&lt;/a&gt;, in comparison to other countries, so in some locations it can be used to generate 3D data.&lt;/p&gt;
&lt;h3 id="real-estate-transactions"&gt;Real estate transactions&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;There is &lt;a href="https://data.gov.sg/dataset/resale-flat-prices" target="_blank" rel="noopener"&gt;a dataset&lt;/a&gt; on resale HDB flats transactions, including the address, storey level, price, remaining lease, etc.&lt;/li&gt;
&lt;li&gt;&lt;a href="https://data.gov.sg/dataset/median-rent-by-town-and-flat-type" target="_blank" rel="noopener"&gt;Median rent by town and flat type (HDB)&lt;/a&gt;, available by quarter since 2005.&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.singstat.gov.sg/find-data/search-by-theme/industry/building-real-estate-construction-and-housing/latest-data" target="_blank" rel="noopener"&gt;Data on vacancies&lt;/a&gt; at the SingStat&amp;rsquo;s portal.&lt;/li&gt;
&lt;li&gt;Private residential properties transactions (incl. rentals) are &lt;a href="https://www.ura.gov.sg/realEstateIIWeb/transaction/search.action" target="_blank" rel="noopener"&gt;available through URA&lt;/a&gt;.
The same agency also releases &lt;a href="https://www.ura.gov.sg/Corporate/Property/Property-Data/Commercial-Properties" target="_blank" rel="noopener"&gt;data on commercial properties&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Although not open data, it is worth mentioning that NUS staff and students have access to &lt;a href="https://www.ura.gov.sg/realis" target="_blank" rel="noopener"&gt;more detailed data&lt;/a&gt; through a subscription.&lt;/p&gt;
&lt;h3 id="demographics"&gt;Demographics&lt;/h3&gt;
&lt;p&gt;If you need demographic data, you will probably head to Data.gov.sg, where you will find &lt;a href="https://data.gov.sg/search?groups=society" target="_blank" rel="noopener"&gt;scores of datasets&lt;/a&gt; at different levels (planning area, subzones) and from different years, so it might take time to navigate their landscape.
For example, you may find:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://data.gov.sg/dataset/households-by-monthly-household-income-and-household-size" target="_blank" rel="noopener"&gt;Households by Monthly Household Income and Household Size&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://data.gov.sg/dataset/resident-working-persons-aged-15-years-over-by-planning-area-gross-monthly-income-from-work-2015?resource_id=e4c209d3-4a07-426a-baeb-a7026d09241c" target="_blank" rel="noopener"&gt;Resident Working Persons Aged 15 Years and Over by Planning Area and Gross Monthly Income from Work, 2015&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://data.gov.sg/dataset/resident-population-by-planning-area-subzone-and-type-of-dwelling-2015" target="_blank" rel="noopener"&gt;Resident Population by Planning Area/Subzone and Type of Dwelling, 2015&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://data.gov.sg/dataset/resident-population-by-single-year-of-age-ethnic-group-and-sex-2015" target="_blank" rel="noopener"&gt;Resident Population by Single Year of Age, Ethnic Group and Sex, 2015&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://data.gov.sg/dataset/singapore-residents-by-subzone-and-type-of-dwelling-jun-2017" target="_blank" rel="noopener"&gt;Singapore Residents by Subzone and Type of Dwelling, Jun 2017&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Some of them, like the last example, are available in a geospatial format.&lt;/p&gt;
&lt;p&gt;However, the best place to get demographic data may be through &lt;a href="https://www.singstat.gov.sg/find-data/search-by-theme/population/geographic-distribution/latest-data" target="_blank" rel="noopener"&gt;SingStat&lt;/a&gt;, which lists them for a clear overview and has detailed time series datasets, so you don&amp;rsquo;t have to join multiple datasets.&lt;/p&gt;
&lt;p&gt;Worth mentioning here is also the SLA&amp;rsquo;s &lt;a href="https://docs.onemap.sg" target="_blank" rel="noopener"&gt;OneMap API&lt;/a&gt; that enables retrieving various demographic data on the planning area level.&lt;/p&gt;
&lt;p&gt;Note that most demographic datasets do not include foreigners who are not permanent residents, who represent a sizeable portion of the population.&lt;/p&gt;
&lt;h3 id="energy-consumption"&gt;Energy consumption&lt;/h3&gt;
&lt;p&gt;To the extent of our knowledge, the most granular dataset available is the Data.gov.sg dataset &lt;a href="https://data.gov.sg/dataset/average-monthly-household-electricity-consumption-by-ura-planning-area-and-dwelling-type-2013" target="_blank" rel="noopener"&gt;Average Monthly Household Electricity Consumption by URA Planning Area &amp;amp; Dwelling Type&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="transportation-and-mobility"&gt;Transportation and mobility&lt;/h3&gt;
&lt;p&gt;There are dozens of datasets in this category, mostly acquired and curated by &lt;a href="https://www.lta.gov.sg/" target="_blank" rel="noopener"&gt;LTA&lt;/a&gt;.&lt;/p&gt;
&lt;h4 id="bus-stops-train-stations-and-routes"&gt;Bus stops, train stations, and routes&lt;/h4&gt;
&lt;p&gt;The location of bus stops and train stations is available at multiple locations: &lt;a href="https://openstreetmap.org" target="_blank" rel="noopener"&gt;OpenStreetMap&lt;/a&gt;, &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/static-data.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt;, and &lt;a href="https://data.gov.sg" target="_blank" rel="noopener"&gt;Data.gov.sg&lt;/a&gt; (note that there are multiple datasets related to this topic, e.g. train stations as &lt;a href="https://data.gov.sg/dataset/sdcp-mrt-station-point" target="_blank" rel="noopener"&gt;points&lt;/a&gt; and &lt;a href="https://data.gov.sg/dataset/master-plan-2014-rail-station" target="_blank" rel="noopener"&gt;polygons&lt;/a&gt;, there is even one on &lt;a href="https://data.gov.sg/dataset/lta-mrt-station-exit" target="_blank" rel="noopener"&gt;MRT/LRT exits&lt;/a&gt;).
Furthermore, rail lines are available at &lt;a href="https://data.gov.sg/dataset/master-plan-2014-rail-line" target="_blank" rel="noopener"&gt;Data.gov.sg&lt;/a&gt;, but they can also be extracted from &lt;a href="https://openstreetmap.org" target="_blank" rel="noopener"&gt;OpenStreetMap&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Besides data on bus stops, the &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/dynamic-data.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt; contains data on bus routes, bus services, and real-time bus arrivals.
You may want to check &lt;a href="https://busrouter.sg" target="_blank" rel="noopener"&gt;BusRouter SG&lt;/a&gt; (together with its sister project &lt;a href="https://railrouter.sg" target="_blank" rel="noopener"&gt;RailRouter SG&lt;/a&gt;) for an awesome web visualisation of this data.
Furthermore, there is a &lt;a href="https://github.com/yinshanyang/singapore-gtfs" target="_blank" rel="noopener"&gt;Github repo&lt;/a&gt; with the data stored according to the &lt;a href="https://en.wikipedia.org/wiki/General_Transit_Feed_Specification" target="_blank" rel="noopener"&gt;General Transit Feed Specification&lt;/a&gt;.&lt;/p&gt;
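GTFS feeds like the one in that repo are plain CSV files; for instance, stops.txt carries stop_id, stop_name, stop_lat, and stop_lon per the specification, so they can be read with the standard library alone. A small sketch (the two rows below are illustrative placeholders, not taken from the linked repo):

```python
import csv
import io

# Illustrative rows in the GTFS stops.txt format.
stops_txt = """stop_id,stop_name,stop_lat,stop_lon
S1,Example Stop A,1.3000,103.8000
S2,Example Stop B,1.3010,103.8020
"""

# Map each stop_id to its (lat, lon) pair.
stops = {row["stop_id"]: (float(row["stop_lat"]), float(row["stop_lon"]))
         for row in csv.DictReader(io.StringIO(stops_txt))}
print(stops["S1"])  # (1.3, 103.8)
```

In practice you would open the file from the feed (e.g. `open("stops.txt")`) instead of the inline string.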
&lt;h4 id="parking-data"&gt;Parking data&lt;/h4&gt;
&lt;p&gt;Parking data is available in real-time for more than 2000 carparks in Singapore, managed by multiple agencies.
One particularity that may go unnoticed is that there are actually two APIs.
One is offered at the &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/dynamic-data.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt; &amp;ndash; it returns detailed availability by carpark, along with some information about each carpark, such as its coordinates.
The second one, linked in the Developer section at &lt;a href="https://data.gov.sg/developer" target="_blank" rel="noopener"&gt;Data.gov.sg&lt;/a&gt;, is similar, but it enables querying historical data while forgoing some information about the carparks, such as their location.
We used this dataset in our &lt;a href="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/"&gt;analysis on mobility during the circuit breaker&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;You can join the carpark availability data with the dataset &lt;a href="https://data.gov.sg/dataset/hdb-carpark-information" target="_blank" rel="noopener"&gt;HDB Carpark Information&lt;/a&gt; to get a few more columns not returned by the APIs.
Note that the location of carparks is simply represented as a point, while the &lt;a href="https://services2.hdb.gov.sg/web/fi10/emap.html" target="_blank" rel="noopener"&gt;HDB Map Services&lt;/a&gt; shows them as shapes.
However, the latter is not available for download.&lt;/p&gt;
&lt;h4 id="origin-and-destination-data-and-passenger-volume-by-stationstop"&gt;Origin and destination data, and passenger volume by station/stop&lt;/h4&gt;
&lt;p&gt;The &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/dynamic-data.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt; has a few APIs that enable downloading public transport (bus, train) traffic every month.
For example, it contains the number of passengers that have travelled between two stations, with a breakdown by type of day (weekday/weekend) and hour.
Data is available for the past three months.
Do note that the entire trip is not available; it&amp;rsquo;s limited to the transportation mode.
For example, if a traveller takes a bus to an MRT station and continues the journey with a train, these are considered as separate trips and cannot be connected in the data.&lt;/p&gt;
&lt;p&gt;Regarding MRT/LRT stations, there are two dynamic APIs, both available via the &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/dynamic-data.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt;.
One returns the real-time platform crowdedness level for the MRT/LRT stations of a particular train line, while the other
provides a forecast of the same at 30-minute intervals.&lt;/p&gt;
&lt;figure id="figure-photo-by-euan-cameronhttpsunsplashcomeuanacameron-on-unsplashhttpsunsplashcomphotos3es_zsaxj_q"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Photo by [Euan Cameron](https://unsplash.com/@euanacameron) on [Unsplash](https://unsplash.com/photos/3Es_ZsAxj_Q)." srcset="
/post/2020/06/24/guide-for-open-urban-data-in-singapore/euan-cameron-3Es_ZsAxj_Q-unsplash_hu_8aba1b375381075f.webp 400w,
/post/2020/06/24/guide-for-open-urban-data-in-singapore/euan-cameron-3Es_ZsAxj_Q-unsplash_hu_8ea4c880725afd90.webp 760w,
/post/2020/06/24/guide-for-open-urban-data-in-singapore/euan-cameron-3Es_ZsAxj_Q-unsplash_hu_448d88004b4ae317.webp 1200w"
src="https://ual.sg/post/2020/06/24/guide-for-open-urban-data-in-singapore/euan-cameron-3Es_ZsAxj_Q-unsplash_hu_8aba1b375381075f.webp"
width="760"
height="505"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Photo by &lt;a href="https://unsplash.com/@euanacameron" target="_blank" rel="noopener"&gt;Euan Cameron&lt;/a&gt; on &lt;a href="https://unsplash.com/photos/3Es_ZsAxj_Q" target="_blank" rel="noopener"&gt;Unsplash&lt;/a&gt;.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;h4 id="current-travel-times-and-speeds"&gt;Current travel times and speeds&lt;/h4&gt;
&lt;p&gt;Another API available thanks to the &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/dynamic-data.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt; returns the estimated travel times on expressways.
It might be useful for studying traffic volumes.
It does not appear to support querying historical data, though.&lt;/p&gt;
&lt;p&gt;A related API, &lt;em&gt;Traffic Speed Bands&lt;/em&gt;, also on the &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/dynamic-data.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt;, returns current traffic speeds on roads.&lt;/p&gt;
&lt;h4 id="routing"&gt;Routing&lt;/h4&gt;
&lt;p&gt;Routing (fetching the distance, estimated travel time, and the geometry of the route) between two points is available through the &lt;a href="https://docs.onemap.sg" target="_blank" rel="noopener"&gt;OneMap API&lt;/a&gt;.
OpenStreetMap is also useful here, e.g. check out the &lt;a href="http://project-osrm.org" target="_blank" rel="noopener"&gt;Open Source Routing Machine&lt;/a&gt; and &lt;a href="https://openrouteservice.org" target="_blank" rel="noopener"&gt;Openrouteservice&lt;/a&gt;.
There are interfaces for Python and R, e.g. we used &lt;a href="https://cran.r-project.org/web/packages/osrm/index.html" target="_blank" rel="noopener"&gt;osrm&lt;/a&gt; in teaching.&lt;/p&gt;
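For instance, a route can be fetched from an OSRM server with a single HTTP request. A minimal sketch (note that the public demo server is intended for light use only, and the coordinates below are illustrative):

```python
def osrm_route_url(start, end, profile="driving",
                   base="http://router.project-osrm.org"):
    """Build an OSRM route request URL. Coordinates are (lon, lat)
    pairs, in that order, as required by the OSRM HTTP API."""
    coords = f"{start[0]},{start[1]};{end[0]},{end[1]}"
    return f"{base}/route/v1/{profile}/{coords}?overview=false"


# Illustrative coordinates: roughly NUS to Changi Airport.
url = osrm_route_url((103.7764, 1.2966), (103.9915, 1.3644))

# Fetching `url` (e.g. with requests) returns JSON in which
# routes[0]["distance"] is in metres and routes[0]["duration"] in seconds.
print(url)
```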
&lt;p&gt;Although not strictly open but rather commercial (they offer a free tier), it is hard not to mention the trio of APIs under the &lt;a href="https://developers.google.com/maps/documentation" target="_blank" rel="noopener"&gt;Google Maps Platform&lt;/a&gt;: the Directions API, Distance Matrix API, and Roads API. They are of high quality, and a lot can be done within the free monthly quota they offer.&lt;/p&gt;
&lt;h4 id="taxi-availability"&gt;Taxi availability&lt;/h4&gt;
&lt;p&gt;The availability of taxis is also available on the &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/dynamic-data.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt;.
The API returns the location of each taxi that is currently available.
The data does not include hired/busy taxis.
Check out &lt;a href="https://taxirouter.sg" target="_blank" rel="noopener"&gt;TaxiRouter SG&lt;/a&gt;, which visualises this data in real time, together with the &lt;a href="https://data.gov.sg/dataset/lta-taxi-stop" target="_blank" rel="noopener"&gt;taxi stands&lt;/a&gt;.&lt;/p&gt;
&lt;h4 id="traffic-images"&gt;Traffic images&lt;/h4&gt;
&lt;p&gt;Traffic images are available through the &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/dynamic-data.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt;.&lt;/p&gt;
&lt;h4 id="transportation-mode"&gt;Transportation mode&lt;/h4&gt;
&lt;p&gt;&lt;a href="https://data.gov.sg/search?q=mode&amp;#43;of&amp;#43;transport" target="_blank" rel="noopener"&gt;Data.gov.sg&lt;/a&gt; contains several datasets on the usual mode of transport used by residents according to surveys.&lt;/p&gt;
&lt;h4 id="mobility-trends"&gt;Mobility trends&lt;/h4&gt;
&lt;p&gt;&lt;a href="https://www.apple.com/covid19/mobility" target="_blank" rel="noopener"&gt;Apple Mobility Trends Reports&lt;/a&gt;, &lt;a href="https://www.google.com/covid19/mobility/" target="_blank" rel="noopener"&gt;Google Community Mobility Reports&lt;/a&gt;, and &lt;a href="https://citymapper.com/cmi" target="_blank" rel="noopener"&gt;CityMapper Mobility Index&lt;/a&gt; all include Singapore.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://citydata.ai" target="_blank" rel="noopener"&gt;CITYDATA&lt;/a&gt; has released &lt;a href="https://univercity.ai/mobility-trip-patterns-for-singapore/" target="_blank" rel="noopener"&gt;Singapore mobility trip patterns as open data&lt;/a&gt;.&lt;/p&gt;
&lt;h4 id="assorted"&gt;Assorted&lt;/h4&gt;
&lt;p&gt;Both the &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/dynamic-data.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt; and &lt;a href="https://www.singstat.gov.sg/find-data/search-by-theme/industry/transport/latest-data" target="_blank" rel="noopener"&gt;SingStat&lt;/a&gt; have more datasets worth having a look at, e.g. number of cars in SG at a fine temporal scale (updated monthly).&lt;/p&gt;
&lt;h3 id="map--geospatial-data-general"&gt;Map / Geospatial data (general)&lt;/h3&gt;
&lt;p&gt;Besides OpenStreetMap, which is &lt;a href="https://ual.sg/post/2020/08/22/assessing-the-quality-of-openstreetmap-building-data-in-singapore/"&gt;quite complete and of high quality for a wide range of features&lt;/a&gt;, the Geospatial Whole Island dataset available through the &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/dynamic-data.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt; is well worth mentioning.
It contains a variety of features related to transportation, e.g. road crossings, traffic lights, taxi stands, and cycling paths.&lt;/p&gt;
&lt;p&gt;Further, &lt;a href="https://data.gov.sg" target="_blank" rel="noopener"&gt;Data.gov.sg&lt;/a&gt; contains some datasets such as the &lt;a href="https://data.gov.sg/dataset/master-plan-2019-subzone-boundary-no-sea" target="_blank" rel="noopener"&gt;boundaries of administrative areas&lt;/a&gt;, &lt;a href="https://data.gov.sg/dataset/master-plan-2019-land-use-layer" target="_blank" rel="noopener"&gt;master plan land use&lt;/a&gt; (containing the &lt;a href="https://www.ura.gov.sg/Corporate/Guidelines/Development-Control/Non-Residential/Commercial/Gross-Plot-Ratio" target="_blank" rel="noopener"&gt;Gross Plot Ratio&lt;/a&gt;), and &lt;a href="https://data.gov.sg/dataset/sla-cadastral-land-parcel" target="_blank" rel="noopener"&gt;cadastral land parcels&lt;/a&gt;.
The &lt;a href="https://data.gov.sg/dataset?q=&amp;amp;organization=national-parks-board" target="_blank" rel="noopener"&gt;series of datasets by NParks&lt;/a&gt; hosted on Data.gov.sg deserves special attention: it covers a wide range of &lt;a href="https://data.gov.sg/dataset/parks" target="_blank" rel="noopener"&gt;park&lt;/a&gt;-related features under their purview, e.g. &lt;a href="https://data.gov.sg/dataset/nparks-activity-area" target="_blank" rel="noopener"&gt;boundaries of activity areas&lt;/a&gt;, &lt;a href="https://data.gov.sg/dataset/nparks-playfitness-equipment" target="_blank" rel="noopener"&gt;locations of play/fitness equipment&lt;/a&gt;, &lt;a href="https://data.gov.sg/dataset/nparks-bbq-pits" target="_blank" rel="noopener"&gt;bbq pits&lt;/a&gt;, &lt;a href="https://data.gov.sg/dataset/park-connector-loop" target="_blank" rel="noopener"&gt;the shape of the park connector loop&lt;/a&gt;, and &lt;a href="https://data.gov.sg/dataset/nparks-car-park-lots" target="_blank" rel="noopener"&gt;carpark lots&lt;/a&gt; (however, do note that the NParks&amp;rsquo; carparks do not appear to be covered by the LTA&amp;rsquo;s API mentioned above).&lt;/p&gt;
&lt;p&gt;For trees, check out &lt;a href="https://github.com/cheeaun/exploretrees-sg" target="_blank" rel="noopener"&gt;ExploreTrees.SG&lt;/a&gt;, derived from &lt;a href="http://trees.sg" target="_blank" rel="noopener"&gt;Trees.SG&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Finally, you may be interested in the &lt;a href="https://doi.org/10.6084/m9.figshare.8267510" target="_blank" rel="noopener"&gt;high-resolution map of Singapore’s terrestrial ecosystems&lt;/a&gt; that was developed by the research team of the &lt;a href="http://www.naturalcapital.sg" target="_blank" rel="noopener"&gt;Natural Capital Singapore&lt;/a&gt; and released as open data.
There is also a &lt;a href="https://doi.org/10.3390/data4030116" target="_blank" rel="noopener"&gt;paper&lt;/a&gt; published.&lt;/p&gt;
&lt;h3 id="aerial-imagery"&gt;Aerial imagery&lt;/h3&gt;
&lt;p&gt;We are not aware of any open high-resolution imagery resources.
Satellite imagery is available for academia through the &lt;a href="https://www.planet.com/markets/education-and-research/" target="_blank" rel="noopener"&gt;Planet&amp;rsquo;s Education and Research Programme&lt;/a&gt;, which we are a member of and which is accessible to other academics as well.&lt;/p&gt;
&lt;h3 id="point-clouds-lidar-terrain-data"&gt;Point clouds (LiDAR), terrain data&lt;/h3&gt;
&lt;p&gt;Pretty much none, except terrain data of coarse resolution such as &lt;a href="https://www2.jpl.nasa.gov/srtm/" target="_blank" rel="noopener"&gt;SRTM&lt;/a&gt;.
Please see also &lt;a href="https://ugl.sg/2022/10/03/dem-of-singapore-srtm/" target="_blank" rel="noopener"&gt;this page&lt;/a&gt; from the &lt;a href="https://ugl.sg" target="_blank" rel="noopener"&gt;NUS Urban Green Lab&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="street-level-imagery"&gt;Street-level imagery&lt;/h3&gt;
&lt;p&gt;Google Street View has pretty good coverage of Singapore (it even includes &lt;a href="https://www.channelnewsasia.com/news/singapore/google-hawker-centres-stalls-street-view-maps-trekkers-11764968" target="_blank" rel="noopener"&gt;hawker centres&lt;/a&gt;), and the data is downloadable through their &lt;a href="https://developers.google.com/maps/documentation/streetview/intro" target="_blank" rel="noopener"&gt;API&lt;/a&gt; (check the T&amp;amp;C though).
&lt;a href="https://www.mapillary.com" target="_blank" rel="noopener"&gt;Mapillary&lt;/a&gt; and &lt;a href="https://kartaview.org/" target="_blank" rel="noopener"&gt;KartaView&lt;/a&gt; are also worth considering.&lt;/p&gt;
&lt;h3 id="airbnb"&gt;Airbnb&lt;/h3&gt;
&lt;p&gt;&lt;a href="http://insideairbnb.com" target="_blank" rel="noopener"&gt;Inside Airbnb&lt;/a&gt; has Airbnb data on Singapore, updated monthly.
It includes listings and their reviews.&lt;/p&gt;
&lt;h3 id="other"&gt;Other&lt;/h3&gt;
&lt;p&gt;There are some datasets that, although we have not used them much so far, are worth mentioning and keeping in mind.
The honourable mentions are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Weather data, which is available through &lt;a href="https://data.gov.sg/developer" target="_blank" rel="noopener"&gt;Data.gov.sg&lt;/a&gt;: real-time weather readings, Pollutant Standards Index (PSI), Ultra-violet Index (UVI), etc.&lt;/li&gt;
&lt;li&gt;&lt;a href="https://data.gov.sg/dataset/eating-establishments" target="_blank" rel="noopener"&gt;Eating establishments by NEA&lt;/a&gt; &amp;ndash; quite comprehensive dataset on all places allowed to sell food in Singapore.
You may also be interested in &lt;a href="https://data.gov.sg/dataset/hawker-centres" target="_blank" rel="noopener"&gt;data on hawker centres&lt;/a&gt; and &lt;a href="https://data.gov.sg/dataset/supermarkets" target="_blank" rel="noopener"&gt;supermarkets&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a href="https://data.gov.sg/dataset/nparks-skyrise-greenery" target="_blank" rel="noopener"&gt;Skyrise Greenery dataset&lt;/a&gt; shows the indicative location of rooftop and vertical greenery.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="notes-and-considerations"&gt;Notes and considerations&lt;/h2&gt;
&lt;h3 id="tabular-data--geocoding"&gt;Tabular data / geocoding&lt;/h3&gt;
&lt;p&gt;While much of the data represents &lt;em&gt;something that happens somewhere&lt;/em&gt; (e.g. real estate transactions), many datasets are not available in a GIS format.
They are rather released as &lt;a href="https://en.wikipedia.org/wiki/Comma-separated_values" target="_blank" rel="noopener"&gt;CSVs&lt;/a&gt; (e.g. real estate transaction datasets contain an address representing each transaction, but not the coordinates, nor is the dataset in a geo-format).
To convert (geocode) an address into coordinates, we suggest using the &lt;a href="https://docs.onemap.sg" target="_blank" rel="noopener"&gt;OneMap API&lt;/a&gt;, &lt;a href="https://nominatim.org" target="_blank" rel="noopener"&gt;Nominatim&lt;/a&gt;, or the &lt;a href="https://cloud.google.com/maps-platform/" target="_blank" rel="noopener"&gt;Google Maps API&lt;/a&gt;.&lt;/p&gt;
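As an illustration of the geocoding step, a minimal Python sketch that parses a OneMap search response. The endpoint and field names are assumptions based on the OneMap documentation at the time of writing, and the sample payload below is made up for this sketch; verify both against the current docs:

```python
def parse_onemap_results(payload):
    """Extract (address, lat, lon) tuples from a OneMap search
    response. Field names assumed from the OneMap search API."""
    return [
        (r["ADDRESS"], float(r["LATITUDE"]), float(r["LONGITUDE"]))
        for r in payload.get("results", [])
    ]


# A trimmed, illustrative response (values are made up):
sample = {
    "found": 1,
    "results": [
        {"ADDRESS": "1 CREATE WAY SINGAPORE 138602",
         "LATITUDE": "1.3046", "LONGITUDE": "103.7732"}
    ],
}
print(parse_onemap_results(sample))

# To geocode for real (endpoint assumed; requests is third-party):
# requests.get("https://developers.onemap.sg/commonapi/search",
#              params={"searchVal": "1 Create Way",
#                      "returnGeom": "Y", "getAddrDetails": "Y"}).json()
```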
&lt;h3 id="web-services"&gt;Web services&lt;/h3&gt;
&lt;p&gt;There are a few web services containing various interesting datasets (e.g. &lt;a href="https://www.onemap.sg/main/v2/" target="_blank" rel="noopener"&gt;OneMap&lt;/a&gt;, &lt;a href="https://www.onemap3d.gov.sg/" target="_blank" rel="noopener"&gt;OneMap3D&lt;/a&gt;, &lt;a href="https://services2.hdb.gov.sg/web/fi10/emap.html" target="_blank" rel="noopener"&gt;HDB Map Services&lt;/a&gt;, &lt;a href="https://www.ura.gov.sg/maps/" target="_blank" rel="noopener"&gt;URA SPACE&lt;/a&gt;, &lt;a href="http://trees.sg" target="_blank" rel="noopener"&gt;Trees.sg&lt;/a&gt;), but not all of them can be downloaded, so they are not considered as &lt;a href="https://opendatahandbook.org/guide/en/what-is-open-data/" target="_blank" rel="noopener"&gt;open data&lt;/a&gt;.
Nevertheless, they may still be useful for viewing.&lt;/p&gt;
&lt;h3 id="social-media"&gt;Social media&lt;/h3&gt;
&lt;p&gt;The &lt;a href="https://developer.twitter.com/en/docs" target="_blank" rel="noopener"&gt;Twitter API&lt;/a&gt; enables downloading their data for Singapore, but the social network is not very popular here, and the data comes with restrictions (so it is technically not open data), which limits its usefulness.&lt;/p&gt;
&lt;h3 id="licence-validity-and-quality-of-data"&gt;Licence, validity and quality of data&lt;/h3&gt;
&lt;p&gt;The usual caveats:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Check when the dataset was last updated. Some datasets are not updated in place; instead, a new instance is released that does not supersede the old one.&lt;/li&gt;
&lt;li&gt;Check the licence, e.g. for Data.gov.sg have a look at the &lt;a href="https://data.gov.sg/open-data-licence" target="_blank" rel="noopener"&gt;Singapore Open Data Licence&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Do not forget to attribute the data source in your use and mention the year when it was created/updated.&lt;/li&gt;
&lt;li&gt;Some geospatial datasets may not pass all validity checks (e.g. they might have self-intersecting polygons), presenting a problem when they are used in spatial analyses.
You can try fixing them using &lt;a href="https://github.com/tudelft3d/prepair" target="_blank" rel="noopener"&gt;prepair&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
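To illustrate the kind of invalidity involved, here is a self-contained Python sketch that detects crossing edges in a polygon ring. It is a naive O(n²) check for demonstration only (it ignores collinear overlaps); tools such as prepair perform the actual repair:

```python
def segments_cross(p1, p2, q1, q2):
    """True if segments p1-p2 and q1-q2 properly intersect
    (cross at an interior point of both)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0


def has_self_intersection(ring):
    """Naive check whether non-adjacent edges of a closed polygon
    ring cross -- the invalidity that breaks spatial analyses."""
    n = len(ring)
    edges = [(ring[i], ring[(i + 1) % n]) for i in range(n)]
    for i in range(n):
        for j in range(i + 2, n):
            if i == 0 and j == n - 1:
                continue  # these edges share the closing vertex
            if segments_cross(*edges[i], *edges[j]):
                return True
    return False


bowtie = [(0, 0), (2, 2), (2, 0), (0, 2)]  # self-intersecting "bowtie"
square = [(0, 0), (2, 0), (2, 2), (0, 2)]  # valid ring
print(has_self_intersection(bowtie), has_self_intersection(square))  # True False
```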
&lt;h3 id="further-reading"&gt;Further reading&lt;/h3&gt;
&lt;p&gt;You might also want to check out &lt;a href="https://nusgis.org/data/" target="_blank" rel="noopener"&gt;this page&lt;/a&gt; by NUS Geography collaborator &lt;a href="https://discovery.nus.edu.sg/19079-yingwei-yan" target="_blank" rel="noopener"&gt;Yingwei Yan&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="have-a-suggestion-for-an-entry-spotted-an-error"&gt;Have a suggestion for an entry? Spotted an error?&lt;/h3&gt;
&lt;p&gt;&lt;a href="https://ual.sg/#contact"&gt;Get in touch&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Keynote at IACAD 2020</title><link>https://ual.sg/post/2020/06/08/keynote-at-iacad-2020/</link><pubDate>Mon, 08 Jun 2020 09:39:03 +0800</pubDate><guid>https://ual.sg/post/2020/06/08/keynote-at-iacad-2020/</guid><description>&lt;p&gt;The &lt;a href="https://casugol.com/iacad/" target="_blank" rel="noopener"&gt;International Academic Conference on Architecture and Design (IACAD) 2020&lt;/a&gt; has been held last week.
The event featured presentations by leading members of academia, researchers, and practitioners, addressing critical issues and future trends in the field of Architecture and Design.&lt;/p&gt;
&lt;p&gt;We participated as well: &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; was invited as a keynote speaker on the topic &lt;em&gt;Status of volunteered geospatial 3D data&lt;/em&gt;.
The talk focused on the potential of OpenStreetMap for generating 3D city models, and showcased preliminary results of our project on developing a method to generate 3D city models using geographic data science and machine learning.&lt;/p&gt;
&lt;p&gt;Thanks &lt;a href="https://casugol.com" target="_blank" rel="noopener"&gt;CASUGOL&lt;/a&gt; for organising the event and for the invitation.&lt;/p&gt;</description></item><item><title>Leveraging cloud technology for GIS analysis with OpenStreetMap</title><link>https://ual.sg/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/</link><pubDate>Mon, 27 Apr 2020 12:55:01 +0800</pubDate><guid>https://ual.sg/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/</guid><description>&lt;p&gt;With &lt;a href="https://www.openstreetmap.org" target="_blank" rel="noopener"&gt;OpenStreetMap&lt;/a&gt; being one of the most widespread and most frequently updated open geospatial datasets, it has become popular among map enthusiasts and GIS professionals. Due to its high volume and frequent updates, cloud technology could be leveraged to provide an efficient environment that is easily accessible and maintained.&lt;/p&gt;
&lt;p&gt;As a major player in cloud platforms, &lt;a href="https://registry.opendata.aws/" target="_blank" rel="noopener"&gt;Amazon Web Services (AWS) hosts a registry of publicly available datasets&lt;/a&gt; &lt;sup id="fnref:1"&gt;&lt;a href="#fn:1" class="footnote-ref" role="doc-noteref"&gt;1&lt;/a&gt;&lt;/sup&gt; accessible via its resources. Among the datasets relevant to GIS analysis, &lt;a href="https://registry.opendata.aws/osm/" target="_blank" rel="noopener"&gt;OpenStreetMap planet data&lt;/a&gt; is one of them. The &lt;a href="https://console.aws.amazon.com/athena/home" target="_blank" rel="noopener"&gt;Amazon Athena&lt;/a&gt; &lt;sup id="fnref:2"&gt;&lt;a href="#fn:2" class="footnote-ref" role="doc-noteref"&gt;2&lt;/a&gt;&lt;/sup&gt; service makes the OpenStreetMap planet data easily accessible for query, analysis and export.&lt;/p&gt;
&lt;p&gt;This post will detail the steps to set up the OpenStreetMap planet table on Athena, mention a couple of useful features of Athena and demonstrate the automation of the weekly refresh of the latest OpenStreetMap planet data made available on AWS.&lt;/p&gt;
&lt;h3 id="pre-requisites"&gt;Pre-requisites&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Have an AWS account, with necessary permissions to AWS S3 and Athena&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="setting-up-planet-table-on-athena"&gt;Setting up planet table on Athena&lt;/h3&gt;
&lt;p&gt;When using Athena for the first time, there will be a warning that requires an AWS S3 location for storing query results. If there isn&amp;rsquo;t a designated bucket already, create one; otherwise, click on the warning for a form to specify the bucket location.&lt;/p&gt;
&lt;figure id="figure-set-up-s3-bucket-for-query-output"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Set up S3 Bucket for Query Output" srcset="
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/s3-bucket-warning_hu_83f332bec487f76d.webp 400w,
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/s3-bucket-warning_hu_50797db6dea963cc.webp 760w,
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/s3-bucket-warning_hu_9cbab764190b6528.webp 1200w"
src="https://ual.sg/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/s3-bucket-warning_hu_83f332bec487f76d.webp"
width="760"
height="467"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Set up S3 Bucket for Query Output
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Using the query editor, we create the &lt;code&gt;default&lt;/code&gt; database using the SQL query below.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-sql" data-lang="sql"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;DATABASE&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;IF&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;NOT&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;EXISTS&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;`&lt;/span&gt;&lt;span class="k"&gt;default&lt;/span&gt;&lt;span class="o"&gt;`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Then we will create the &lt;code&gt;planet&lt;/code&gt; table with the following query:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-sql" data-lang="sql"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;EXTERNAL&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;TABLE&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;planet&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;BIGINT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;STRING&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;tags&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;MAP&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;STRING&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;STRING&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;lat&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;DECIMAL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;9&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;7&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="c1"&gt;-- for nodes
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;lon&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;DECIMAL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;7&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="c1"&gt;-- for nodes
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;nds&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;ARRAY&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;STRUCT&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="k"&gt;ref&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;BIGINT&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="c1"&gt;-- for ways
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;members&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;ARRAY&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;STRUCT&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;STRING&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;ref&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;BIGINT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;role&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;STRING&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="c1"&gt;-- for relations
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;changeset&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;BIGINT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;timestamp&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;TIMESTAMP&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;uid&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;BIGINT&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;user&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;STRING&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;version&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;BIGINT&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="n"&gt;STORED&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;AS&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;ORCFILE&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="k"&gt;LOCATION&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;s3://osm-pds/planet/&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;After this, we can do a quick preview of the table to check on the imported data.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-sql" data-lang="sql"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;FROM&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;planet&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;LIMIT&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;figure id="figure-preview-table"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Preview Table" srcset="
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/featured_hu_3f190fe4d24e8a67.webp 400w,
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/featured_hu_81541a9f995841e4.webp 760w,
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/featured_hu_ee592a23ab6f8ee6.webp 1200w"
src="https://ual.sg/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/featured_hu_3f190fe4d24e8a67.webp"
width="760"
height="362"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Preview Table
&lt;/figcaption&gt;&lt;/figure&gt;
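Spatial filters can also be expressed directly in SQL against this table. A minimal Python sketch that builds a bounding-box query for Singapore and, optionally, submits it via boto3 (the bounding box is approximate, and the results bucket name is a placeholder for your own):

```python
def bbox_query(min_lon, min_lat, max_lon, max_lat, limit=100):
    """Build an Athena SQL query selecting OSM nodes inside a
    bounding box from the `planet` table defined above."""
    return (
        "SELECT id, tags, lat, lon FROM planet "
        "WHERE type = 'node' "
        f"AND lon BETWEEN {min_lon} AND {max_lon} "
        f"AND lat BETWEEN {min_lat} AND {max_lat} "
        f"LIMIT {limit};"
    )


# Rough bounding box around Singapore (approximate coordinates):
query = bbox_query(103.6, 1.15, 104.1, 1.48)
print(query)

# To run it against the table (boto3 is third-party; the bucket
# below is a placeholder for the results bucket configured earlier):
# import boto3
# athena = boto3.client("athena")
# athena.start_query_execution(
#     QueryString=query,
#     QueryExecutionContext={"Database": "default"},
#     ResultConfiguration={"OutputLocation": "s3://YOUR-RESULTS-BUCKET/"})
```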
&lt;h3 id="useful-features-of-athena"&gt;Useful Features of Athena&lt;/h3&gt;
&lt;p&gt;Athena offers some useful features, such as Saved Queries, where we can store queries we use often.&lt;/p&gt;
&lt;figure id="figure-save-frequently-used-query"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Save Frequently Used Query" srcset="
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/save-query_hu_f37b32fcfaf02c5b.webp 400w,
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/save-query_hu_d539957a24e423c4.webp 760w,
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/save-query_hu_f5875745a00c4ba6.webp 1200w"
src="https://ual.sg/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/save-query_hu_f37b32fcfaf02c5b.webp"
width="760"
height="396"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Save Frequently Used Query
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Also, the Query History shows all prior query attempts and their outcome. This is particularly useful to get an idea of the volume of data we have queried.&lt;/p&gt;
&lt;figure id="figure-query-history-for-usage-review"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Query History for Usage Review" srcset="
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/query-history_hu_6d3868fc48977397.webp 400w,
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/query-history_hu_29364d6298aaa6f5.webp 760w,
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/query-history_hu_f73ae84f64a01c4a.webp 1200w"
src="https://ual.sg/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/query-history_hu_6d3868fc48977397.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Query History for Usage Review
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Very often, it is useful to save the query output as a table or view, or to export it for download. The supported export formats are shown below:&lt;/p&gt;
&lt;figure id="figure-anthena-supports-multiple-export-formats"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Anthena Supports Multiple Export Formats" srcset="
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/supported-export-format_hu_e2ba68b270848621.webp 400w,
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/supported-export-format_hu_2ea1fdea28c62d94.webp 760w,
/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/supported-export-format_hu_697ede7de811aa0c.webp 1200w"
src="https://ual.sg/post/2020/04/27/leveraging-cloud-technology-for-gis-analysis-with-openstreetmap/supported-export-format_hu_e2ba68b270848621.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Athena Supports Multiple Export Formats
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;The output files are available in the S3 bucket configured earlier.&lt;/p&gt;
&lt;p&gt;As of this writing, AWS Athena supports exporting results in &lt;code&gt;Parquet&lt;/code&gt;, &lt;code&gt;ORC&lt;/code&gt;, &lt;code&gt;AVRO&lt;/code&gt;, &lt;code&gt;CSV&lt;/code&gt;, &lt;code&gt;JSON&lt;/code&gt; and &lt;code&gt;TSV&lt;/code&gt; formats.&lt;/p&gt;
&lt;p&gt;When deciding on the output format, it is useful to keep these considerations in mind&lt;sup id="fnref:3"&gt;&lt;a href="#fn:3" class="footnote-ref" role="doc-noteref"&gt;3&lt;/a&gt;&lt;/sup&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Read/Write Intensive &amp;amp; Query Pattern&lt;/li&gt;
&lt;li&gt;Compression&lt;/li&gt;
&lt;li&gt;Schema Evolution&lt;/li&gt;
&lt;li&gt;Nested Columns&lt;/li&gt;
&lt;li&gt;Platform&lt;/li&gt;
&lt;/ul&gt;
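&lt;p&gt;To illustrate saving the output as a table, Athena supports &lt;code&gt;CREATE TABLE AS SELECT&lt;/code&gt; (CTAS), which also lets us pick the storage format. Below is a minimal sketch in Python that composes such a statement; the table name &lt;code&gt;sg_amenities&lt;/code&gt; and the tag filter are hypothetical examples, and the resulting SQL could be run through a PyAthena cursor like the one in the refresh script further down.&lt;/p&gt;

```python
# Minimal sketch: compose a CREATE TABLE AS SELECT (CTAS) statement that
# materialises a query's output as a new table in a chosen storage format.
# The table name, source table, and tag filter are hypothetical examples.

def ctas_sql(new_table, source_table, tag, fmt="PARQUET"):
    """Build a CTAS statement keeping only elements carrying a given OSM tag.

    element_at() is Presto's null-safe map lookup, so rows whose tags map
    lacks the key are simply filtered out instead of raising an error.
    """
    return (
        f"CREATE TABLE {new_table} "
        f"WITH (format = '{fmt}') AS "
        f"SELECT id, type, tags, lat, lon "
        f"FROM {source_table} "
        f"WHERE element_at(tags, '{tag}') IS NOT NULL"
    )

sql = ctas_sql("sg_amenities", "planet", "amenity")
print(sql)
# The statement could then be submitted with cursor.execute(sql) via PyAthena.
```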
&lt;h3 id="automate-refresh-of-planet-data"&gt;Automate refresh of planet data&lt;/h3&gt;
&lt;p&gt;Not only is the setup of the OpenStreetMap planet table on Athena easy; its maintenance and the periodic refresh to the latest data can also be automated with a simple script.&lt;/p&gt;
&lt;p&gt;Although some OpenStreetMap sources on other websites are updated daily, the OpenStreetMap dataset on AWS is updated only once a week. When that happens, we want to refresh the planet table to the latest version. Instead of performing the update manually, we can automate the boring stuff and set up &lt;a href="https://crontab.guru/" target="_blank" rel="noopener"&gt;crontab&lt;/a&gt; to run the following script.&lt;/p&gt;
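&lt;p&gt;For instance, since the AWS dataset refreshes weekly, a crontab entry along these lines (the script path is a placeholder) would run the update every Monday at 03:00:&lt;/p&gt;

```
# m h dom mon dow command
0 3 * * 1 /usr/bin/python3 /path/to/refresh_planet.py
```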
&lt;p&gt;The script requires the
&lt;a href="https://aws.amazon.com/sdk-for-python/" target="_blank" rel="noopener"&gt;AWS SDK for Python (Boto3)&lt;/a&gt;
and &lt;a href="https://github.com/laughingman7743/PyAthena/" target="_blank" rel="noopener"&gt;PyAthena&lt;/a&gt;.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-python3" data-lang="python3"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;pyathena&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;connect&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="n"&gt;cursor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;connect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;aws_access_key_id&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;XXXXX&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="n"&gt;aws_secret_access_key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;XXXXX&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="n"&gt;s3_staging_dir&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;s3://bucket-for-staging/&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="n"&gt;region_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;ap-southeast-1&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;cursor&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="n"&gt;create_db_sql&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;&amp;#34;&amp;#34;&amp;#34;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; CREATE DATABASE IF NOT EXISTS default;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt;&amp;#34;&amp;#34;&amp;#34;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="n"&gt;drop_table_sql&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;&amp;#34;&amp;#34;&amp;#34;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; DROP TABLE IF EXISTS planet;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt;&amp;#34;&amp;#34;&amp;#34;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="n"&gt;refresh_planet_sql&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;&amp;#34;&amp;#34;&amp;#34;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; CREATE EXTERNAL TABLE planet (
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; id BIGINT,
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; type STRING,
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; tags MAP&amp;lt;STRING,STRING&amp;gt;,
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; lat DECIMAL(9,7),
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; lon DECIMAL(10,7),
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; nds ARRAY&amp;lt;STRUCT&amp;lt;ref: BIGINT&amp;gt;&amp;gt;,
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; members ARRAY&amp;lt;STRUCT&amp;lt;type: STRING, ref: BIGINT, role: STRING&amp;gt;&amp;gt;,
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; changeset BIGINT,
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; timestamp TIMESTAMP,
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; uid BIGINT,
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; user STRING,
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; version BIGINT
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; )
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; STORED AS ORCFILE
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt; LOCATION &amp;#39;s3://osm-pds/planet/&amp;#39;;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="s2"&gt;&amp;#34;&amp;#34;&amp;#34;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="n"&gt;cursor&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;execute&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;create_db_sql&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="n"&gt;cursor&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;execute&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;drop_table_sql&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="n"&gt;cursor&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;execute&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;refresh_planet_sql&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="nb"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;&amp;#34;Table `planet` is updated.&amp;#34;&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="ne"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="nb"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nb"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cursor&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;description&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;h3 id="other-considerations"&gt;Other Considerations&lt;/h3&gt;
&lt;p&gt;Working with cloud technology often means having a third party host our private data and/or intellectual property. We should be mindful in choosing the region/country under whose jurisdiction these assets fall.&lt;/p&gt;
&lt;p&gt;Last but not least, it is also important to follow security best practices, such as enforcing strong passwords, granting only the minimum permissions required, and enabling multi-factor authentication, to ensure our private data and intellectual property are protected.&lt;/p&gt;
&lt;div class="footnotes" role="doc-endnotes"&gt;
&lt;hr&gt;
&lt;ol&gt;
&lt;li id="fn:1"&gt;
&lt;p&gt;&lt;a href="https://docs.opendata.aws/osm-pds/readme.html" target="_blank" rel="noopener"&gt;OpenStreetMap on AWS&lt;/a&gt;&amp;#160;&lt;a href="#fnref:1" class="footnote-backref" role="doc-backlink"&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id="fn:2"&gt;
&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/athena/index.html" target="_blank" rel="noopener"&gt;AWS Athena Documentation&lt;/a&gt;&amp;#160;&lt;a href="#fnref:2" class="footnote-backref" role="doc-backlink"&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id="fn:3"&gt;
&lt;p&gt;&lt;a href="https://towardsdatascience.com/demystify-hadoop-data-formats-avro-orc-and-parquet-e428709cf3bb" target="_blank" rel="noopener"&gt;Demystifying Hadoop Data Format&lt;/a&gt;&amp;#160;&lt;a href="#fnref:3" class="footnote-backref" role="doc-backlink"&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;</description></item><item><title>Updated plots on the current mobility situation around the world</title><link>https://ual.sg/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/</link><pubDate>Fri, 24 Apr 2020 08:50:53 +0800</pubDate><guid>https://ual.sg/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/</guid><description>&lt;p&gt;About two weeks ago, we published
&lt;a href="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/"&gt;an article&lt;/a&gt; on how Singapore has responded to the new stringent preventive measures against COVID-19, including a comparison with dozens of other cities around the world.&lt;/p&gt;
&lt;p&gt;We have used data from &lt;a href="https://citymapper.com" target="_blank" rel="noopener"&gt;Citymapper&lt;/a&gt;, i.e. the &lt;a href="https://citymapper.com/cmi" target="_blank" rel="noopener"&gt;Citymapper Mobility Index&lt;/a&gt;, which indicates the daily change in city movement compared to the usual trends (the index is calculated by comparing trips planned in the Citymapper app to a recent typical usage period).
Considering that the dataset is continuously updated, it is also time to update some of the plots.&lt;/p&gt;
&lt;p&gt;The first one suggests that the intensity of transit remained low in Singapore after it plunged following the introduction of the &lt;a href="https://www.gov.sg/article/what-you-can-and-cannot-do-during-the-circuit-breaker-period" target="_blank" rel="noopener"&gt;&lt;em&gt;circuit breaker&lt;/em&gt;&lt;/a&gt;.
Still, it has further decreased by a few percentage points (this will also become visible in the final visual in this article).&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_hu_42c70cebcf7c92c0.webp 400w,
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_hu_74c34eb685808c25.webp 760w,
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_hu_3eb3654ec28c57fc.webp 1200w"
src="https://ual.sg/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_hu_42c70cebcf7c92c0.webp"
width="760"
height="447"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;Transit in Singapore is now around the world average.
Seoul and Hong Kong remained the least affected in the sample of 41 cities.
Nevertheless, their transit volume is at about a third of the usual levels, which is still an enormous drop.&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_multi-city_hu_9cd0296ad4ba8599.webp 400w,
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_multi-city_hu_de13a21f8e495ba.webp 760w,
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_multi-city_hu_782dfdda35025426.webp 1200w"
src="https://ual.sg/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_multi-city_hu_9cd0296ad4ba8599.webp"
width="760"
height="581"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;Another look at the same dataset, with each city illustrated separately:&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_multi-city_panels_hu_e1874e2a0d3165af.webp 400w,
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_multi-city_panels_hu_bd0fcba6df8f9bae.webp 760w,
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_multi-city_panels_hu_fd345866be379c2a.webp 1200w"
src="https://ual.sg/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_multi-city_panels_hu_e1874e2a0d3165af.webp"
width="670"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;And finally, the change over the past two weeks is visualised, which may be a good indicator of recent events, such as the introduction or relaxation of measures.
The plot compares the situation on Wednesday 8 April (second day of the new stringent measures in Singapore) with the one on Wednesday 22 April 2020.
This time (in comparison with &lt;a href="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/"&gt;the last version of the same plot&lt;/a&gt;), no city covered in the dataset has experienced a radical change.&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_dumbbell_hu_645099f3654d5b9.webp 400w,
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_dumbbell_hu_b8f333628beae6f1.webp 760w,
/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_dumbbell_hu_251f6ed0c0278c86.webp 1200w"
src="https://ual.sg/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/cmi-2020-04-23_dumbbell_hu_645099f3654d5b9.webp"
width="670"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;h3 id="acknowledgements"&gt;Acknowledgements&lt;/h3&gt;
&lt;p&gt;Thanks to &lt;a href="https://citymapper.com/cmi" target="_blank" rel="noopener"&gt;Citymapper&lt;/a&gt; for releasing and maintaining this dataset.&lt;/p&gt;
&lt;h3 id="alternative-data"&gt;Alternative data&lt;/h3&gt;
&lt;p&gt;In the meantime, Apple &lt;a href="https://www.apple.com/sg/newsroom/2020/04/apple-makes-mobility-data-available-to-aid-covid-19-efforts/" target="_blank" rel="noopener"&gt;made their aggregated navigation data from Maps available&lt;/a&gt; for &lt;a href="https://www.apple.com/covid19/mobility/" target="_blank" rel="noopener"&gt;download&lt;/a&gt;.
It also includes Singapore, and it might be analysed in some future article.&lt;/p&gt;</description></item><item><title>Welcome, Yoong Shin</title><link>https://ual.sg/post/2020/04/23/welcome-yoong-shin/</link><pubDate>Thu, 23 Apr 2020 09:39:03 +0800</pubDate><guid>https://ual.sg/post/2020/04/23/welcome-yoong-shin/</guid><description>&lt;p&gt;We welcome &lt;a href="https://ual.sg/author/yoong-shin-chow/"&gt;Yoong Shin Chow&lt;/a&gt; as a new full-time researcher in the group.&lt;/p&gt;
&lt;p&gt;She will be working on a new project on developing a method to generate 3D city models using geographic data science and machine learning.
The work aspires to create 3D city models of cities that do not have 3D data yet, which would open the door for environmental analyses and other urban analytics purposes.&lt;/p&gt;
&lt;p&gt;Yoong Shin studied in the United States and has rich experience (15+ years) in the IT industry in different roles.
She is also active in the local Python community &amp;ndash; &lt;a href="https://pycon.sg" target="_blank" rel="noopener"&gt;PyCon SG&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Welcome, Yoong Shin!&lt;/p&gt;</description></item><item><title>Singapore's urban data affirms the compliance with the Circuit Breaker measures</title><link>https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/</link><pubDate>Sun, 12 Apr 2020 21:39:03 +0800</pubDate><guid>https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/</guid><description>&lt;div class="alert alert-note"&gt;
&lt;div&gt;
Some of the plots in this article have been updated two weeks later in a new &lt;a href="https://ual.sg/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/" target="_blank" rel="noopener"&gt;post&lt;/a&gt; using the latest available data.
&lt;/div&gt;
&lt;/div&gt;
&lt;p&gt;The news and social media are full of visualisations not only of the proliferation of the COVID-19 pandemic but also of how the measures to mitigate it have affected numerous aspects of daily life and the economy.
These help us understand whether social distancing is being taken seriously.
The analyses presented focus mostly on the US, China, and Europe; and I have not seen much attention on Singapore, with the notable exception&lt;sup id="fnref:1"&gt;&lt;a href="#fn:1" class="footnote-ref" role="doc-noteref"&gt;1&lt;/a&gt;&lt;/sup&gt; of &lt;a href="https://www.google.com/covid19/mobility/" target="_blank" rel="noopener"&gt;Google&amp;rsquo;s Community Mobility Reports&lt;/a&gt;, and a couple of earlier comparisons with other cities relying on commercial data&lt;sup id="fnref:2"&gt;&lt;a href="#fn:2" class="footnote-ref" role="doc-noteref"&gt;2&lt;/a&gt;&lt;/sup&gt;.
Here is an excerpt from Google&amp;rsquo;s latest report indicating the response to social distancing guidance:&lt;/p&gt;
&lt;figure id="figure-snippet-of-the-googles-community-mobility-reports-for-singapore-released-on-5-apr-2020-just-before-the-new-stringent-measures"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Snippet of the Google&amp;#39;s Community Mobility Reports for Singapore, released on 5 Apr 2020, just before the new stringent measures." srcset="
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/Google-CMR_hu_b70922241de8ed3f.webp 400w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/Google-CMR_hu_e3e559c591077d5c.webp 760w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/Google-CMR_hu_c338700d2792ae20.webp 1200w"
src="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/Google-CMR_hu_b70922241de8ed3f.webp"
width="760"
height="172"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Snippet of Google&amp;rsquo;s Community Mobility Reports for Singapore, released on 5 Apr 2020, just before the new stringent measures.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;Now that we have concluded the first week of the &lt;a href="https://www.gov.sg/article/what-you-can-and-cannot-do-during-the-circuit-breaker-period" target="_blank" rel="noopener"&gt;new stringent preventive measures&lt;/a&gt; (&lt;em&gt;circuit breaker&lt;/em&gt;), I have analysed a few datasets and produced a couple of visuals to illustrate the current situation in Singapore with social distancing (a few words about the data and method are at the &lt;a href="#method"&gt;end of the article&lt;/a&gt;), and how Singapore compares to other cities around the world.&lt;/p&gt;
&lt;p&gt;One of the most apparent urban aspects to look at is movement, as activities have been curtailed, and it may directly suggest compliance with the measures and social distancing guidelines.&lt;/p&gt;
&lt;h3 id="car-traffic-now-before-and-before-then"&gt;Car traffic now, before, and&amp;hellip; before then&lt;/h3&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/carpark_availability_panels_hu_52f0ea24d62e8c46.webp 400w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/carpark_availability_panels_hu_f2fbb6712105d6bd.webp 760w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/carpark_availability_panels_hu_29c86bba5921bf57.webp 1200w"
src="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/carpark_availability_panels_hu_52f0ea24d62e8c46.webp"
width="760"
height="581"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;The first plot indicates the intensity of car traffic in Singapore in detail (by day and hour) over three periods: the first week of the circuit breaker, last week, and the same period the previous year.
The data is derived from HDB carpark availability as a proxy for traffic (more about the data and rationale for this approach is given &lt;a href="#method"&gt;later&lt;/a&gt;).
While it is interesting per se to see daily and hourly patterns during the ordinary times, it indicates a considerable drop during the first week of the CB measures.
Unsurprisingly, the traffic plunged on the first day of the circuit breaker (Tuesday 7 Apr), and it is at levels below those of a typical Sunday, when traffic is usually at its lowest point.
Friday was the first public holiday during the circuit breaker, rendering a very notable trace.&lt;/p&gt;
&lt;p&gt;The plot suggests that the measures are taken seriously.
Moreover, the downward slope was already evident the week before the CB, as more people were taking precautions, e.g. working from home following &lt;a href="https://www.mom.gov.sg/covid-19/advisory-on-safe-distancing-measures" target="_blank" rel="noopener"&gt;the advisory of the Ministry of Manpower&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The next graphic echoes the above analysis from a different perspective, enabling an easier comparison of the impact of the initial measures versus that of the circuit breaker.&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/weekdays_taken_hu_b37838f6a343378d.webp 400w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/weekdays_taken_hu_d62ddf1e7a9e11d6.webp 760w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/weekdays_taken_hu_558ad18e285c7f88.webp 1200w"
src="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/weekdays_taken_hu_b37838f6a343378d.webp"
width="760"
height="431"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;While Friday is not comparable to the previous Fridays, as it was a public holiday (Good Friday), an interesting insight is that hardly anyone left their residence that day, at least not by car (the number of cars parked during the day remained at almost the same level as the preceding night).&lt;/p&gt;
&lt;h3 id="leading-to-where-we-are-now"&gt;Leading to where we are now&lt;/h3&gt;
&lt;p&gt;The visual below shows a longer longitudinal daily trend since early December, before the pandemic started, illustrating the traffic until now and covering some key events in the timeline as they unfolded.
I also marked public holidays, when traffic drops regardless of the current events, to exclude these from consideration.&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cal_plot_hu_dc7b32671b4962c4.webp 400w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cal_plot_hu_71df22132d7f0663.webp 760w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cal_plot_hu_747018b3577bffe3.webp 1200w"
src="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cal_plot_hu_dc7b32671b4962c4.webp"
width="760"
height="523"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;On 23 January the first case in Singapore was confirmed.
The next day, traffic plunged, though very likely due to Chinese New Year&amp;rsquo;s Eve.
Days in the following week were probably also affected by the festivities and more people being on leave, rather than by the pandemic, which was in its early stage at that time.&lt;/p&gt;
&lt;p&gt;On Friday 7 February, the Government raised the nation&amp;rsquo;s Disease Outbreak Response System Condition (DORSCON) level.&lt;/p&gt;
&lt;p&gt;On Friday 13 March, the Ministry of Manpower issued &lt;a href="https://www.mom.gov.sg/covid-19/advisory-on-safe-distancing-measures" target="_blank" rel="noopener"&gt;an advisory&lt;/a&gt;, which states &lt;em&gt;Where employees can perform their work by telecommuting from home, the employer must ensure that they do so&lt;/em&gt;.
It appears that by the middle of the following workweek, the traffic had visibly dropped.&lt;/p&gt;
&lt;h3 id="what-about-public-transportation"&gt;What about public transportation&lt;/h3&gt;
&lt;p&gt;The above visuals reflect car usage (predominantly private car given that the focus is on residential carparks).
Given that Singapore&amp;rsquo;s car ownership rate is among the lowest in the world and, owing to &lt;a href="https://blog.seedly.sg/buy-car-how-much-should-be-earning/" target="_blank" rel="noopener"&gt;the world&amp;rsquo;s highest cost of car ownership&lt;/a&gt;, is associated with income, the above might not be entirely representative of the entire population (although, with ride-hailing being prominent, the number of residents travelling by car is effectively higher than the ownership rate suggests and should be more heterogeneous).&lt;/p&gt;
&lt;p&gt;The only sufficiently detailed openly available dataset on public transportation I have found is the one released by &lt;a href="https://citymapper.com" target="_blank" rel="noopener"&gt;Citymapper&lt;/a&gt;, as part of the &lt;a href="https://citymapper.com/cmi" target="_blank" rel="noopener"&gt;Citymapper Mobility Index&lt;/a&gt;, hinting at the daily change of the city moving compared to the usual trends (the index is calculated by comparing trips planned in the Citymapper app to a recent typical usage period).&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_hu_e815c9d88ff1f523.webp 400w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_hu_ad50a48223b5e2ae.webp 760w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_hu_bd650f4d216f8ab1.webp 1200w"
src="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_hu_e815c9d88ff1f523.webp"
width="760"
height="447"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;The app is used mainly for public transportation, but the data also includes walking and cycling.
The data covers the period from 2 March 2020 onwards.
As expected, it also indicates a considerable drop as the Circuit Breaker measures were put in place, but also that social distancing guidelines were being taken seriously well before then.
Good job, Singapore.&lt;/p&gt;
&lt;h3 id="comparison-with-global-trends"&gt;Comparison with global trends&lt;/h3&gt;
&lt;p&gt;Helpfully, the same dataset also covers 40 other cities around the world, enabling a comparison.
How does Singapore compare with other cities?
Looking at the rest of the world, life in Singapore continued at relatively normal levels longer than elsewhere, with a gradual decrease (presumably reducing mobility only to essentials).&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_multi-city_hu_b8d26eefa829904f.webp 400w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_multi-city_hu_bc9c153241928cad.webp 760w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_multi-city_hu_3f34f9a7a1a62393.webp 1200w"
src="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_multi-city_hu_b8d26eefa829904f.webp"
width="760"
height="581"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;For a more detailed overview, the cities are plotted separately:&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_multi-city_panels_hu_2ce60d32a36a1ffc.webp 400w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_multi-city_panels_hu_6f8e8584c86875f7.webp 760w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_multi-city_panels_hu_a23c92f084974e4d.webp 1200w"
src="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_multi-city_panels_hu_2ce60d32a36a1ffc.webp"
width="670"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;The above exposes cities that have either not implemented stringent measures, or where the rules are flouted.&lt;/p&gt;
&lt;p&gt;Finally, the next plot shows the change over the two weeks ending 9 April (excluding the current weekend, as Easter would bias the trend).
It helps in understanding which cities have slowed down recently.
According to the Citymapper data, mobility has decreased in almost all of the cities, or has remained at approximately the same levels where it had already dropped considerably (the slight increases may be within the margin of error).&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_dumbbell_hu_fca68ca8483ff09b.webp 400w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_dumbbell_hu_b8a24582c441b831.webp 760w,
/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_dumbbell_hu_b20be8f92146a6c9.webp 1200w"
src="https://ual.sg/post/2020/04/12/singapores-urban-data-affirms-the-compliance-with-the-circuit-breaker-measures/cmi-2020-04-11_dumbbell_hu_fca68ca8483ff09b.webp"
width="670"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;p&gt;Singapore experienced the sharpest drop among all 41 cities.
This may be good news as it indicates that people are taking the measures seriously, drastically changing their behaviour overnight.&lt;/p&gt;
&lt;h3 id="other-urban-aspects"&gt;Other urban aspects&lt;/h3&gt;
&lt;p&gt;Urban data is more than transportation data.
For example, I also looked into air quality, but there does not seem to be a significant change so far.
Some urban datasets will take time to become available, such as real estate transactions.
These are left for a future analysis.&lt;/p&gt;
&lt;h2 id="method"&gt;Method&lt;/h2&gt;
&lt;p&gt;How were these data visualisations made?
The data and tools used to create this analysis are entirely based on open (public) data and open-source tools.
This section also includes a brief mention of alternative datasets that were considered and may be the subject of future analyses.&lt;/p&gt;
&lt;h3 id="data-car-traffic"&gt;Data: car traffic&lt;/h3&gt;
&lt;p&gt;To the best of my knowledge, there is no publicly available dataset on road transportation in Singapore, such as the number of cars currently on the roads, so much has to be improvised and assumptions have to be made.&lt;/p&gt;
&lt;p&gt;The &lt;a href="https://www.lta.gov.sg/" target="_blank" rel="noopener"&gt;Land Transport Authority&lt;/a&gt; provides several static and dynamic open datasets related to transportation.
Among them, there is a resource with real-time travel times across quite a few roads in Singapore, distributed through their &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall/dynamic-data.html" target="_blank" rel="noopener"&gt;API&lt;/a&gt; as part of the &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt;.
This dataset could serve as a proxy for the number of cars on the roads.
However, the API does not seem to allow querying historical data; it only returns the current situation, so past trends cannot be retrieved.&lt;/p&gt;
&lt;p&gt;For this analysis, carpark availability data was used.
The real-time data is available through the &lt;a href="https://data.gov.sg/dataset/carpark-availability" target="_blank" rel="noopener"&gt;Carpark Availability API&lt;/a&gt; as part of the &lt;a href="https://data.gov.sg/developer" target="_blank" rel="noopener"&gt;Data.gov.sg developer resources&lt;/a&gt;.
The API returns live information about 2000 carparks in Singapore, managed by HDB, LTA, and URA, and for each, it returns the number of available lots, total lots, type of lots, agency in charge, and so on.
It is refreshed every minute, and it allows retrieving carpark availability well into the past.&lt;/p&gt;
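&lt;p&gt;As a minimal sketch of how one response from this API can be aggregated into a single figure of occupied lots: the field names below (&lt;code&gt;carpark_data&lt;/code&gt;, &lt;code&gt;carpark_info&lt;/code&gt;, &lt;code&gt;lot_type&lt;/code&gt;, etc.) follow the JSON structure the endpoint returned at the time of writing and should be treated as assumptions rather than a documented contract; the original analysis was done in R, while this sketch uses Python.&lt;/p&gt;

```python
def occupancy(response, lot_type="C"):
    """Sum occupied lots of a given type across all carparks in one
    API response. Field names mirror the Data.gov.sg carpark
    availability JSON as observed at the time of writing (an
    assumption, not a documented contract). Lot type "C" is cars."""
    total = available = 0
    for carpark in response["items"][0]["carpark_data"]:
        for info in carpark["carpark_info"]:
            if info["lot_type"] != lot_type:  # e.g. skip heavy-vehicle lots
                continue
            total += int(info["total_lots"])          # values arrive as strings
            available += int(info["lots_available"])
    return total - available  # occupied lots

# A minimal fabricated sample mimicking the response shape:
sample = {"items": [{"carpark_data": [
    {"carpark_number": "HE12", "carpark_info": [
        {"lot_type": "C", "total_lots": "105", "lots_available": "34"},
        {"lot_type": "H", "total_lots": "10", "lots_available": "2"}]},
    {"carpark_number": "BM29", "carpark_info": [
        {"lot_type": "C", "total_lots": "50", "lots_available": "20"}]},
]}]}
print(occupancy(sample))  # 101 occupied car lots (71 + 30)
```

&lt;p&gt;Repeating such a query at regular intervals (the feed refreshes every minute) yields a time series of carpark occupancy.&lt;/p&gt;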
&lt;p&gt;I considered only HDB carparks, to focus on the residential aspect, and I filtered out lots that are not for cars (e.g. those for trucks).
Therefore, the data above presents a subset of all Singapore carparks/cars/traffic, but I believe it is sufficiently large and representative to infer solid trends.&lt;/p&gt;
&lt;p&gt;Carpark availability, in the absence of better data, should be a reasonably good proxy for the movement of people by car.
Certainly, there are disadvantages and limitations to this approach.
For example, a resident can leave their own estate and park at a friend&amp;rsquo;s estate, resulting in a net-zero record even though they moved; at a large scale, however, the data should reflect movement accurately enough to give a good picture of car usage.&lt;/p&gt;
&lt;p&gt;Regarding the second plot, which compares the absolute number of parked cars with the previous year, one may argue that more lots are taken now because the car population grew (there is no publicly available data for 2020 yet to confirm this).
However, given the &lt;a href="https://www.bbc.com/news/business-41730778" target="_blank" rel="noopener"&gt;recent freeze of the number of private vehicles in Singapore&lt;/a&gt;, I doubt that the car population changed notably, if at all.&lt;/p&gt;
&lt;h3 id="data-transit"&gt;Data: transit&lt;/h3&gt;
&lt;p&gt;The &lt;a href="https://citymapper.com/cmi" target="_blank" rel="noopener"&gt;Citymapper Mobility Index&lt;/a&gt; mainly reflects public transport, but the app is also used for walking, cycling, and some micromobility and cabs.
It does not differentiate between transportation modes, and as Citymapper &lt;a href="https://citymapper.com/cmi/about" target="_blank" rel="noopener"&gt;highlights&lt;/a&gt;, it represents a subset of general mobility.
Thus, there may be some deviations from the real world.
A disadvantage of the dataset may be that the usual trends also include tourists, not only locals.
Therefore, a drop in mobility might reflect not only the compliance with social distancing measures but also a drop in tourist arrivals.&lt;/p&gt;
&lt;p&gt;Regarding public transport, LTA provides Origin-Destination data at the &lt;a href="https://www.mytransport.sg/content/mytransport/home/dataMall.html" target="_blank" rel="noopener"&gt;LTA DataMall&lt;/a&gt; for both bus and MRT, but only on a monthly basis, without a breakdown by day.
It will undoubtedly be useful to check it out once the data for March and April are released, to get at least some general trends.&lt;/p&gt;
&lt;p&gt;Also, well worth mentioning in this context is the &lt;a href="https://www.spaceout.gov.sg/" target="_blank" rel="noopener"&gt;URA&amp;rsquo;s Space Out&lt;/a&gt; &amp;mdash; &lt;a href="https://www.ura.gov.sg/Corporate/Media-Room/Media-Releases/pr20-15" target="_blank" rel="noopener"&gt;released a week ago&lt;/a&gt; &amp;mdash; providing regular updates on the crowd levels in malls across Singapore.
The data does not appear to be available for download, but the website may provide some insights into the change in the footfall in malls.&lt;/p&gt;
&lt;h3 id="tools"&gt;Tools&lt;/h3&gt;
&lt;p&gt;The data was processed, transformed, and visualised using
&lt;i class="fab fa-r-project pr-1 fa-fw"&gt;&lt;/i&gt; and &lt;code&gt;ggplot2&lt;/code&gt;.
The theme used is from the package &lt;a href="https://github.com/hrbrmstr/hrbrthemes" target="_blank" rel="noopener"&gt;&lt;code&gt;hrbrthemes&lt;/code&gt;&lt;/a&gt;, with some modifications.&lt;/p&gt;
&lt;h3 id="acknowledgements"&gt;Acknowledgements&lt;/h3&gt;
&lt;p&gt;Both datasets used are gratefully acknowledged.&lt;/p&gt;
&lt;h3 id="update"&gt;Update&lt;/h3&gt;
&lt;p&gt;Some of the plots in this article have been updated two weeks later in a new &lt;a href="https://ual.sg/post/2020/04/24/updated-plots-on-the-current-mobility-situation-around-the-world/"&gt;post&lt;/a&gt; using the latest available data.&lt;/p&gt;
&lt;p&gt;&lt;code&gt;#StayHomeForSG&lt;/code&gt;&lt;/p&gt;
&lt;div class="footnotes" role="doc-endnotes"&gt;
&lt;hr&gt;
&lt;ol&gt;
&lt;li id="fn:1"&gt;
&lt;p&gt;There are, though, many visualisations about the spread of the virus, cases, hospital discharges, etc. For example, there is a &lt;a href="https://co.vid19.sg/singapore/" target="_blank" rel="noopener"&gt;dashboard of the outbreak in Singapore
&lt;/a&gt;.&amp;#160;&lt;a href="#fnref:1" class="footnote-backref" role="doc-backlink"&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id="fn:2"&gt;
&lt;p&gt;Straits Times: &lt;a href="https://www.straitstimes.com/singapore/transport/public-transport-usage-plummets-as-more-stay-home" target="_blank" rel="noopener"&gt;Public transport usage plummets as more stay home&lt;/a&gt;, 26 Mar 2020.&amp;#160;&lt;a href="#fnref:2" class="footnote-backref" role="doc-backlink"&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;</description></item><item><title>OGC considering CityJSON as community standard</title><link>https://ual.sg/post/2020/02/15/ogc-considering-cityjson-as-community-standard/</link><pubDate>Sat, 15 Feb 2020 19:50:15 +0800</pubDate><guid>https://ual.sg/post/2020/02/15/ogc-considering-cityjson-as-community-standard/</guid><description>&lt;p&gt;We&amp;rsquo;re relaying &lt;a href="https://www.opengeospatial.org/pressroom/pressreleases/3152" target="_blank" rel="noopener"&gt;a press release from the Open Geospatial Consortium&lt;/a&gt; about &lt;a href="https://cityjson.org" target="_blank" rel="noopener"&gt;CityJSON&lt;/a&gt;, a standard which we endorsed as OGC member, and have been using in our work, e.g. &lt;a href="https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/"&gt;to produce 3D data of Singapore&amp;rsquo;s public housing buildings&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id="ogc-considering-cityjson-as-community-standard-seeks-public-comment-for-new-work-item"&gt;OGC considering CityJSON as community standard; seeks public comment for new Work Item&lt;/h2&gt;
&lt;h4 id="release-date-thursday-13-february-2020-utc"&gt;Release Date: Thursday, 13 February 2020 UTC&lt;/h4&gt;
&lt;p&gt;CityJSON provides a simplified alternative to the GML encoding of CityGML that is also lightweight and suitable for use on the web and mobile.&lt;/p&gt;
&lt;p&gt;The Open Geospatial Consortium (OGC) is considering CityJSON for adoption as an official OGC Community Standard. A new Work Item justification to begin the Community Standard endorsement process is available for public comment.&lt;/p&gt;
&lt;p&gt;CityJSON is a JSON-based encoding for a subset of the &lt;a href="https://www.opengeospatial.org/standards/citygml" target="_blank" rel="noopener"&gt;OGC CityGML data model&lt;/a&gt;, which is an open standardized data model and exchange format to store digital 3D models of cities and landscapes.&lt;/p&gt;
&lt;p&gt;CityJSON defines ways to describe most of the common 3D features and objects found in cities (such as buildings, roads, rivers, bridges, vegetation, and city furniture) and the relationships between them. It also defines different standard levels of detail (LoDs) for the 3D objects, which allows different resolutions of objects for different applications and purposes. CityJSON considerably simplifies the storage and exchange of 3D city models.&lt;/p&gt;
&lt;p&gt;The purpose of CityJSON is to offer an alternative to the GML encoding of CityGML, which can be verbose and therefore complex to work with. The design objective for CityJSON is ease of use for both reading datasets and for creating them. CityJSON was designed with programmers in mind, therefore tools and APIs supporting it can be quickly built. It was also designed to be compact - using CityJSON typically compresses publicly available CityGML files by a factor of 6x - while at the same time being friendly for web and mobile development.&lt;/p&gt;
&lt;p&gt;CityJSON was developed, and is maintained, by the &lt;a href="https://3d.bk.tudelft.nl/" target="_blank" rel="noopener"&gt;3D geoinformation group at TU Delft&lt;/a&gt;. Others have since joined its development, especially &lt;a href="https://www.virtualcitysystems.de/" target="_blank" rel="noopener"&gt;virtualcitySYSTEMS&lt;/a&gt; and Claus Nagel.&lt;/p&gt;
&lt;p&gt;An approved &lt;a href="http://www.opengeospatial.org/blog/2543" target="_blank" rel="noopener"&gt;OGC Community Standard&lt;/a&gt; is an official standard of OGC that is considered to be a widely used, mature specification, but was developed outside of OGC’s standards development and approval process. The originator of the standard brings to OGC a “snapshot” of their work that is then endorsed by OGC membership so that it can become part of the OGC Standards Baseline.&lt;/p&gt;
&lt;p&gt;The proposed CityJSON community standard work item justification is available for review and comment on the &lt;a href="https://portal.opengeospatial.org/files/91843" target="_blank" rel="noopener"&gt;OGC Portal&lt;/a&gt;. Comments are due by 5th March, 2020, and should be submitted via the method outlined on the &lt;a href="https://www.opengeospatial.org/standards/requests/200" target="_blank" rel="noopener"&gt;CityJSON community standard work item public comment request page&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="about-ogc"&gt;About OGC&lt;/h3&gt;
&lt;p&gt;The Open Geospatial Consortium (OGC) is an international consortium of more than 530 businesses, government agencies, research organizations, and universities driven to make geospatial (location) information and services FAIR - Findable, Accessible, Interoperable, and Reusable.
OGC’s member-driven consensus process creates royalty free, publicly available geospatial standards. Existing at the cutting edge, OGC actively analyzes and anticipates emerging tech trends, and runs an agile, collaborative Research and Development (R&amp;amp;D) lab that builds and tests innovative prototype solutions to members&amp;rsquo; use cases.
OGC members together form a global forum of experts and communities that use location to connect people with technology and improve decision-making at all levels. OGC is committed to creating a sustainable future for us, our children, and future generations.
Visit &lt;a href="http://ogc.org/" target="_blank" rel="noopener"&gt;ogc.org&lt;/a&gt; for more info on our work.&lt;/p&gt;
&lt;h4 id="contact"&gt;Contact: &lt;a href="mailto:info@opengeospatial.org"&gt;info@opengeospatial.org&lt;/a&gt;&lt;/h4&gt;</description></item><item><title>Introducing the application-driven LOD modeling paradigm for 3D building models</title><link>https://ual.sg/post/2020/02/04/introducing-the-application-driven-lod-modeling-paradigm-for-3d-building-models/</link><pubDate>Tue, 04 Feb 2020 09:50:15 +0800</pubDate><guid>https://ual.sg/post/2020/02/04/introducing-the-application-driven-lod-modeling-paradigm-for-3d-building-models/</guid><description>
&lt;figure id="figure-shadow-calculation-for-five-building-models-top-and-their-derived-compact-counterparts-bottom-where-it-can-be-found-that-the-areas-of-the-building-shadows-are-nearly-the-same-while-the-number-of-triangles-has-been-greatly-reduced"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Shadow calculation for five building models (top) and their derived compact counterparts (bottom), where it can be found that the areas of the building shadows are nearly the same while the number of triangles has been greatly reduced." srcset="
/post/2020/02/04/introducing-the-application-driven-lod-modeling-paradigm-for-3d-building-models/featured_hu_55f1273000e0733b.webp 400w,
/post/2020/02/04/introducing-the-application-driven-lod-modeling-paradigm-for-3d-building-models/featured_hu_af05eedcc7de6004.webp 760w,
/post/2020/02/04/introducing-the-application-driven-lod-modeling-paradigm-for-3d-building-models/featured_hu_60e2c1cf72953634.webp 1200w"
src="https://ual.sg/post/2020/02/04/introducing-the-application-driven-lod-modeling-paradigm-for-3d-building-models/featured_hu_55f1273000e0733b.webp"
width="760"
height="451"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Shadow calculation for five building models (top) and their derived compact counterparts (bottom), where it can be found that the areas of the building shadows are nearly the same while the number of triangles has been greatly reduced.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;We published a new collaborative &lt;a href="https://ual.sg/publication/2020-ijprs-application-lod/"&gt;paper&lt;/a&gt;:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Tang L, Ying S, Li L, Biljecki F, Zhu H, Zhu Y, Yang F, Su F (2020): An application-driven LOD modeling paradigm for 3D building models. &lt;em&gt;ISPRS Journal of Photogrammetry and Remote Sensing&lt;/em&gt; 161:194-207. &lt;a href="https://doi.org/10.1016/j.isprsjprs.2020.01.019" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.1016/j.isprsjprs.2020.01.019&lt;/a&gt; &lt;a href="https://ual.sg/publication/2020-ijprs-application-lod/2020-ijprs-application-lod.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The first author is &lt;a href="https://www.researchgate.net/profile/Lei_Tang29" target="_blank" rel="noopener"&gt;Lei Tang&lt;/a&gt; of Wuhan University.&lt;/p&gt;
&lt;p&gt;The level of detail (LOD) concept for 3D building models, which indicates the degree of closeness between a model and its real-world counterpart, is deeply rooted among the stakeholders in the field of urban research and 3D geoinformation. However, with the increasing use and demand of a wide range of applications, the LOD definition standardized by the City Geography Markup Language (CityGML) standard appears to be generic, potentially resulting in redundancy and inflexibility. To address this issue, we reconsider the LOD concept from an application point of view and suggest a new context-aware heterogeneous LOD modeling paradigm for 3D building models tailored to specific applications. The new proposal challenges the original homogeneous generic modeling logic and instead adopts a bottom-up approach, putting the focus on the building components rather than on the building itself, resulting in models that may lead to a better fitness for use. In this paper, we first specify a number of discrete LODs for building component models, called CLODs, and then assemble them to derive the LODs of building models suited for particular applications, diminishing redundancy and being tailored for a specific application. To obtain the appropriate LOD specification, we introduce five essential evaluation criteria and a series of semantic and geometrically assembled constraints on the CLODs. We implement two experiments, outdoor component selection and indoor furniture simulation, and conclude that the proposed application-driven LOD definition is more suited in the context of particular applications.&lt;/p&gt;
&lt;p&gt;For more information please see the &lt;a href="https://ual.sg/publication/2020-ijprs-application-lod/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;Congratulations to Lei Tang for publishing a part of his PhD research in one of the top journals in the field! &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2020-ijprs-application-lod/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2020/02/04/introducing-the-application-driven-lod-modeling-paradigm-for-3d-building-models/page-one_hu_475c5626c3b1623c.webp 400w,
/post/2020/02/04/introducing-the-application-driven-lod-modeling-paradigm-for-3d-building-models/page-one_hu_a0acce3bb76752fa.webp 760w,
/post/2020/02/04/introducing-the-application-driven-lod-modeling-paradigm-for-3d-building-models/page-one_hu_336436b95eaf5c7f.webp 1200w"
src="https://ual.sg/post/2020/02/04/introducing-the-application-driven-lod-modeling-paradigm-for-3d-building-models/page-one_hu_475c5626c3b1623c.webp"
width="600"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2020_ijprs_application_lod&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Tang, Lei and Ying, Shen and Li, Lin and Biljecki, Filip and Zhu, HaiHong and Zhu, Yi and Yang, Fan and Su, Fei}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{An application-driven LOD modeling paradigm for 3D building models}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Journal of Photogrammetry and Remote Sensing}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2020}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{161}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{194--207}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.1016/j.isprsjprs.2020.01.019}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Thesis opportunities for undergrad and grad students</title><link>https://ual.sg/post/2019/12/19/thesis-opportunities-for-undergrad-and-grad-students/</link><pubDate>Thu, 19 Dec 2019 05:47:19 +0800</pubDate><guid>https://ual.sg/post/2019/12/19/thesis-opportunities-for-undergrad-and-grad-students/</guid><description>&lt;p&gt;We are half way through the academic year, and many NUS students are looking for a topic for their dissertation/thesis, or simply to learn something new.
We created &lt;a href="https://ual.sg/teaching/#theses-dissertations-and-capstone-projects"&gt;a list of student projects&lt;/a&gt; within the frame of GIS, 3D city modelling &amp;amp; urban analytics.
Our topics are suitable for different study programmes, from Geography to Computer Science; they relate to thrilling real-world problems and enable students to learn new skills.
Thanks to the &lt;a href="http://www.nus.edu.sg/registrar/education-at-nus/non-graduating-programme.html" target="_blank" rel="noopener"&gt;NUS non-graduating non-exchange scheme&lt;/a&gt;, students from outside NUS and Singapore are eligible to participate as well.&lt;/p&gt;
&lt;p&gt;Have a look at the &lt;a href="https://ual.sg/teaching/#theses-dissertations-and-capstone-projects"&gt;list&lt;/a&gt;, and feel free to get in touch.&lt;/p&gt;</description></item><item><title>Funding opportunities for prospective PhD candidates and postdoctoral researchers</title><link>https://ual.sg/post/2019/12/13/funding-opportunities-for-prospective-phd-candidates-and-postdoctoral-researchers/</link><pubDate>Fri, 13 Dec 2019 09:55:19 +0800</pubDate><guid>https://ual.sg/post/2019/12/13/funding-opportunities-for-prospective-phd-candidates-and-postdoctoral-researchers/</guid><description>&lt;p&gt;We prepared &lt;a href="https://ual.sg/openings/"&gt;a list&lt;/a&gt; of external funding opportunities to carry out a PhD, postdoc, or visiting research with us.
Are you interested in applying for one of these to conduct research in GIS, 3D &amp;amp; urban analytics?
&lt;a href="https://ual.sg/#contact"&gt;Contact us&lt;/a&gt;.
Visit our &lt;a href="https://ual.sg/openings/"&gt;opportunities page&lt;/a&gt; for the full list.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/12/13/funding-opportunities-for-prospective-phd-candidates-and-postdoctoral-researchers/featured_hu_e9a1112bef778f7.webp 400w,
/post/2019/12/13/funding-opportunities-for-prospective-phd-candidates-and-postdoctoral-researchers/featured_hu_9797c56eb93be6e7.webp 760w,
/post/2019/12/13/funding-opportunities-for-prospective-phd-candidates-and-postdoctoral-researchers/featured_hu_d2cf57b048f93e96.webp 1200w"
src="https://ual.sg/post/2019/12/13/funding-opportunities-for-prospective-phd-candidates-and-postdoctoral-researchers/featured_hu_e9a1112bef778f7.webp"
width="760"
height="660"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Visit to Seoul, Korea</title><link>https://ual.sg/post/2019/12/08/visit-to-seoul-korea/</link><pubDate>Sun, 08 Dec 2019 20:55:19 +0800</pubDate><guid>https://ual.sg/post/2019/12/08/visit-to-seoul-korea/</guid><description>&lt;p&gt;Dr &lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; has visited Korea to strengthen the collaborations and network of the &lt;a href="https://ual.sg/author/urban-analytics-lab/"&gt;Urban Analytics Lab&lt;/a&gt; in East Asia.
This is a follow up visit 6 months after &lt;a href="https://ual.sg/post/2019/05/22/visits-to-korea-and-japan/"&gt;the roadshow in the region&lt;/a&gt;.
The visit to Seoul included presentations and meetings at the University of Seoul and the company Allforland.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/12/08/visit-to-seoul-korea/1_hu_f921744b917e476a.webp 400w,
/post/2019/12/08/visit-to-seoul-korea/1_hu_baf67803317df2c.webp 760w,
/post/2019/12/08/visit-to-seoul-korea/1_hu_4f1bd5d7b8c649c2.webp 1200w"
src="https://ual.sg/post/2019/12/08/visit-to-seoul-korea/1_hu_f921744b917e476a.webp"
width="760"
height="455"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Korean organisations have been leading the development in indoor spatial data modelling.
For example, a large portion of the OGC standard &lt;a href="http://www.indoorgml.net" target="_blank" rel="noopener"&gt;IndoorGML&lt;/a&gt; was developed thanks to extensive research and development done at Korean universities.
We are pleased to be able to get a sneak peek in these developments.&lt;/p&gt;
&lt;p&gt;The previous and current work of our Lab was presented.
We look forward to collaborating with and learning more from our Korean partners, and to contributing our expertise in developing specifications and data models.&lt;/p&gt;
&lt;p&gt;감사합니다.&lt;/p&gt;</description></item><item><title>Call for papers for our workshop at ACM ICMR 2020 in Dublin</title><link>https://ual.sg/post/2019/12/02/call-for-papers-for-our-workshop-at-acm-icmr-2020-in-dublin/</link><pubDate>Mon, 02 Dec 2019 17:18:30 +0800</pubDate><guid>https://ual.sg/post/2019/12/02/call-for-papers-for-our-workshop-at-acm-icmr-2020-in-dublin/</guid><description>&lt;p&gt;We are involved in the organisation of the &lt;a href="http://www2.nict.go.jp/bidal/icdar_icmr2020/index.html" target="_blank" rel="noopener"&gt;Workshop on Intelligent Cross-Data Analysis and Retrieval (ICDAR)&lt;/a&gt; at the &lt;a href="http://www.icmr2020.org" target="_blank" rel="noopener"&gt;Annual ACM International Conference on Multimedia Retrieval (ICMR) 2020&lt;/a&gt; to be held in Dublin, Ireland.
The event is scheduled for 8 July 2020.&lt;/p&gt;
&lt;p&gt;The goal of the workshop is to attract researchers and experts in the areas of multimedia information retrieval, machine learning, AI, data science, event-based processing and analysis, multimodal multimedia content analysis, lifelog data analysis, urban computing, environmental science, and atmospheric science to tackle the intelligent cross-data analysis issue.
The accepted papers are expected to be published in the workshop proceedings.
Authors of excellent papers are encouraged to submit them to journals or to a special issue that will be organized by the organizers.&lt;/p&gt;
&lt;p&gt;Please check the call for papers below or head to the &lt;a href="http://www2.nict.go.jp/bidal/icdar_icmr2020/index.html" target="_blank" rel="noopener"&gt;official website&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id="call-for-papers"&gt;Call for papers&lt;/h2&gt;
&lt;p&gt;Thanks to the rapid development of sensors, communication technologies, and social networks, people can now quickly collect data about themselves and their surrounding environment. This ability opens new opportunities to better understand the associations between human beings and the properties of their surroundings, which governments, industries, and citizens can exploit for intelligence, planning, control, retrieval, and decision making. Wearable sensors, lifelog cameras, and social networks capture people’s health, activities, and behaviours from a first-person perspective, while ambient sensors, social network interactions, and third-party data provide a third-person perspective on how societal activities unfold. Many studies have addressed each perspective separately, but few have focused on how to analyse and retrieve cross-data coming from different perspectives for the greater benefit of human beings. The workshop aims to attract researchers working on intelligent cross-data analysis and retrieval towards a smart and sustainable society. Research domains range from wellbeing, disaster prevention &amp;amp; mitigation, and mobility to food computing, to name a few.&lt;/p&gt;
&lt;h3 id="topics"&gt;Topics&lt;/h3&gt;
&lt;p&gt;Example topics of interest include, but are not limited to, the following:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Event-based cross-data retrieval&lt;/li&gt;
&lt;li&gt;Data mining and AI technologies to discover and predict spatial-temporal-semantic correlations across data sources&lt;/li&gt;
&lt;li&gt;Complex event processing for dynamically linking sensor data from individuals, regions, and broader areas&lt;/li&gt;
&lt;li&gt;Transfer learning between regions to effectively and efficiently construct or customise similar analyses and predictions of events using locally collected data&lt;/li&gt;
&lt;li&gt;Hypothesis development on associations within heterogeneous data, towards multimodal models that capture the impact of the surrounding environment on human beings at the local and individual scale&lt;/li&gt;
&lt;li&gt;Realisation of prosperous and independent regions in which people and nature coexist&lt;/li&gt;
&lt;li&gt;Applications that leverage intelligent cross-data analysis for a particular domain&lt;/li&gt;
&lt;li&gt;Cross-datasets for repeatable experimentation&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="key-dates"&gt;Key dates&lt;/h3&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Date&lt;/th&gt;
&lt;th&gt;Milestone&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Mar 12, 2020&lt;/td&gt;
&lt;td&gt;Deadline for Workshop Paper Submission&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Mar 15, 2020&lt;/td&gt;
&lt;td&gt;Notification of Acceptance for Demo Papers&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Mar 28, 2020&lt;/td&gt;
&lt;td&gt;Notification of Acceptance for Workshop Papers&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Apr 15, 2020&lt;/td&gt;
&lt;td&gt;Camera-Ready Workshop Papers Due&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Jun 8, 2020&lt;/td&gt;
&lt;td&gt;ICMR 2020 Workshops Day&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h3 id="people"&gt;People&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Minh-Son Dao, National Institute of Information and Communications Technology (NICT), Japan&lt;/li&gt;
&lt;li&gt;Morten Fjeld, University of Bergen, Norway&lt;/li&gt;
&lt;li&gt;Uraz Yavanoglu, Gazi University, Turkey&lt;/li&gt;
&lt;li&gt;Filip Biljecki, National University of Singapore, Singapore&lt;/li&gt;
&lt;li&gt;Mianxiong Dong, Muroran Institute of Technology, Japan&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>Our participation at the World Economic Forum - Global Future Councils 2019</title><link>https://ual.sg/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/</link><pubDate>Tue, 05 Nov 2019 20:02:11 +0400</pubDate><guid>https://ual.sg/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/</guid><description>
&lt;figure id="figure-c-world-economic-forum--benedikt-von-loebell"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="(c) World Economic Forum / Benedikt von Loebell" srcset="
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49004908821_541111e926_o_hu_9f59795986027390.webp 400w,
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49004908821_541111e926_o_hu_2c8f981bd107bb8c.webp 760w,
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49004908821_541111e926_o_hu_9a93b06b4779ab3a.webp 1200w"
src="https://ual.sg/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49004908821_541111e926_o_hu_9f59795986027390.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
(c) World Economic Forum / Benedikt von Loebell
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;The &lt;a href="https://www.weforum.org" target="_blank" rel="noopener"&gt;World Economic Forum&lt;/a&gt;’s &lt;a href="https://www.weforum.org/communities/global-future-councils" target="_blank" rel="noopener"&gt;Global Future Councils&lt;/a&gt; are the world’s foremost interdisciplinary knowledge network dedicated to promoting innovative thinking to shape a sustainable and inclusive future for all.
The network convenes more than 700 of the most relevant and knowledgeable thought leaders from academia, government, business and civil society, grouped in expertise-based, thematic councils. It is an invitation-only community, and members are nominated for a one-year period.&lt;/p&gt;
&lt;figure id="figure-c-world-economic-forum--benedikt-von-loebell"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="(c) World Economic Forum / Benedikt von Loebell" srcset="
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49004909266_89f8736ffd_o_hu_ba1ccacb3a53180e.webp 400w,
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49004909266_89f8736ffd_o_hu_a65d55f84084dc3a.webp 760w,
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49004909266_89f8736ffd_o_hu_c65c112cac294b28.webp 1200w"
src="https://ual.sg/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49004909266_89f8736ffd_o_hu_ba1ccacb3a53180e.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
(c) World Economic Forum / Benedikt von Loebell
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;As in previous years, the councils convened in Dubai.
&lt;a href="https://ual.sg/author/filip-biljecki/"&gt;Filip Biljecki&lt;/a&gt; is taking part in the &lt;a href="https://www.weforum.org/communities/the-future-of-cities-and-urbanization" target="_blank" rel="noopener"&gt;Global Future Council on Cities and Urbanization&lt;/a&gt; for the second year, working with cities to find the innovation, capacity and financing necessary to address their challenges.&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-c-world-economic-forum--benedikt-von-loebell"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="(c) World Economic Forum / Benedikt von Loebell" srcset="
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49007653427_2a8324ebdd_o_hu_47ebc5fdfd16bffb.webp 400w,
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49007653427_2a8324ebdd_o_hu_bf973070a7b859d8.webp 760w,
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49007653427_2a8324ebdd_o_hu_75aa47b32a6e07eb.webp 1200w"
src="https://ual.sg/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49007653427_2a8324ebdd_o_hu_47ebc5fdfd16bffb.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
(c) World Economic Forum / Benedikt von Loebell
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;figure id="figure-c-world-economic-forum--benedikt-von-loebell"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="(c) World Economic Forum / Benedikt von Loebell" srcset="
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49007438276_d18565c611_o_hu_9185c0ac4d04c1ad.webp 400w,
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49007438276_d18565c611_o_hu_8779daff9166e58e.webp 760w,
/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49007438276_d18565c611_o_hu_abd74bcc447e33e4.webp 1200w"
src="https://ual.sg/post/2019/11/05/our-participation-at-the-world-economic-forum-global-future-councils-2019/49007438276_d18565c611_o_hu_9185c0ac4d04c1ad.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
(c) World Economic Forum / Benedikt von Loebell
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;In this mandate, the council will explore strategies to achieve low-carbon cities.
We are particularly interested in discovering how 3D city models can help us achieve that goal.&lt;/p&gt;</description></item><item><title>Welcome Jonas</title><link>https://ual.sg/post/2019/10/21/welcome-jonas/</link><pubDate>Mon, 21 Oct 2019 12:10:41 +0800</pubDate><guid>https://ual.sg/post/2019/10/21/welcome-jonas/</guid><description>&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/10/21/welcome-jonas/featured_hu_e36d563b6a17afa6.webp 400w,
/post/2019/10/21/welcome-jonas/featured_hu_deec467bd2e54d73.webp 760w,
/post/2019/10/21/welcome-jonas/featured_hu_b7db73e6d24d7453.webp 1200w"
src="https://ual.sg/post/2019/10/21/welcome-jonas/featured_hu_e36d563b6a17afa6.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;It is a pleasure to have a new visiting scholar at our lab.
Dr &lt;a href="https://ual.sg/author/jonas-teuwen/"&gt;Jonas Teuwen&lt;/a&gt; completed his PhD in Applied Mathematics, and is currently assistant professor at the Radboud University Nijmegen in the Netherlands.
Jonas&amp;rsquo; principal research focus is on the development of efficient deep learning algorithms in imaging.&lt;/p&gt;
&lt;p&gt;At the NUS Urban Analytics Lab he will work on applications of machine learning in architecture and real estate.&lt;/p&gt;
&lt;p&gt;Welcome Jonas, and enjoy your stay at NUS and in Singapore.&lt;/p&gt;</description></item><item><title>Participation at Geospatial Kuala Lumpur 2019</title><link>https://ual.sg/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/</link><pubDate>Thu, 03 Oct 2019 16:31:54 +0800</pubDate><guid>https://ual.sg/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/</guid><description>&lt;p&gt;We attended the &lt;a href="https://www.geoinfo.utm.my/geospatial2019/" target="_blank" rel="noopener"&gt;Geospatial KL 2019 International Conference&lt;/a&gt;, which featured the following three major conferences in one place:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.geoinfo.utm.my/ggt2019/" target="_blank" rel="noopener"&gt;6th International Conference Geomatics &amp;amp; Geospatial Technology&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.geoinfo.utm.my/sdsc2019/" target="_blank" rel="noopener"&gt;4th International Conference on Smart Data and Smart Cities&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://isoladm.org/LADM2019Workshop" target="_blank" rel="noopener"&gt;8th FIG Workshop on the Land Administration Domain Model (LADM)&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This big event was held at the Hotel Istana in Kuala Lumpur, Malaysia during the first three days of October 2019.&lt;/p&gt;
&lt;p&gt;The conference was very well attended, with an impressive 300+ participants from 37 countries, as well as several exhibitors.&lt;/p&gt;
&lt;p&gt;We also presented a &lt;a href="https://ual.sg/publication/2019-sdsc-airbnb-beijing/"&gt;paper&lt;/a&gt; at the Smart Data and Smart Cities conference, based on the master thesis of &lt;a href="https://ual.sg/authors/jialin/"&gt;Li Jialin&lt;/a&gt; who graduated from the &lt;a href="http://www.nus.edu.sg/nusbulletin/school-of-design-and-environment/graduate-education/coursework-programmes/degree-requirements/master-of-urban-planning/" target="_blank" rel="noopener"&gt;Master in Urban Planning programme at NUS&lt;/a&gt; earlier this year.
The conference papers have been published in &lt;a href="https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLII-4-W17/" target="_blank" rel="noopener"&gt;ISPRS Archives&lt;/a&gt; and &lt;a href="https://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/IV-4-W9/" target="_blank" rel="noopener"&gt;ISPRS Annals&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We thank the organisers, especially the conference chair Professor Sr Dr Alias bin Abdul Rahman (&lt;a href="http://builtsurvey.utm.my" target="_blank" rel="noopener"&gt;3D GIS Research Lab, Faculty of Built Environment and Surveying, Universiti Teknologi Malaysia&lt;/a&gt;), for the excellent organisation.
We also express our gratitude to the exhibitors, and fellow researchers for contributing to the conference with a variety of interesting papers and ideas.&lt;/p&gt;
&lt;p&gt;Together with the &lt;a href="https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/"&gt;3D GeoInfo 2019 conference hosted last week in Singapore&lt;/a&gt;, it was fantastic to have two large and prominent geospatial events in Southeast Asia in such a short period of time.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/1_hu_19fc4d499f91f227.webp 400w,
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/1_hu_d12d078a6747e276.webp 760w,
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/1_hu_49f8d833ed4d6a94.webp 1200w"
src="https://ual.sg/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/1_hu_19fc4d499f91f227.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/2_hu_c69c2d1695e63073.webp 400w,
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/2_hu_499f7f790ed4edc7.webp 760w,
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/2_hu_c5d57869d546578b.webp 1200w"
src="https://ual.sg/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/2_hu_c69c2d1695e63073.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/3_hu_6d7a010f833cc193.webp 400w,
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/3_hu_45ba4db95af4d16f.webp 760w,
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/3_hu_b6db00094598a27c.webp 1200w"
src="https://ual.sg/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/3_hu_6d7a010f833cc193.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/4_hu_4241670606cc3612.webp 400w,
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/4_hu_91800f3737b797e9.webp 760w,
/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/4_hu_9140b37b222b9ed9.webp 1200w"
src="https://ual.sg/post/2019/10/03/participation-at-geospatial-kuala-lumpur-2019/4_hu_4241670606cc3612.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>3D GeoInfo 2019 in Singapore: a success. Thanks everyone!</title><link>https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/</link><pubDate>Sat, 28 Sep 2019 12:11:22 +0800</pubDate><guid>https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/</guid><description>&lt;p&gt;We are very pleased to have had this year&amp;rsquo;s &lt;a href="https://www.3dgeoinfo2019.com" target="_blank" rel="noopener"&gt;3D GeoInfo hosted in Singapore&lt;/a&gt;.
The 14th edition of the conference, co-organised by the &lt;a href="http://www.nus.edu.sg" target="_blank" rel="noopener"&gt;National University of Singapore&lt;/a&gt; and the &lt;a href="https://www1.sla.gov.sg" target="_blank" rel="noopener"&gt;Singapore Land Authority&lt;/a&gt;, was part of a broader event named 3D Singapore, which also comprised three pre-conference workshops: 2nd BIM/GIS Integration Workshop, Point Clouds Training, and Big Data and Urban Analytics Workshop.
The 4-day event, endorsed by the &lt;a href="https://www.isprs.org" target="_blank" rel="noopener"&gt;ISPRS&lt;/a&gt; and hosted at the Asian Civilisations Museum, was attended by about 200 participants from academia, industry, and public organisations; and it was supported by 6 exhibitors: &lt;a href="http://www.aamgroup.com/" target="_blank" rel="noopener"&gt;AAM&lt;/a&gt;, &lt;a href="https://www.bentley.com/en" target="_blank" rel="noopener"&gt;Bentley&lt;/a&gt;, &lt;a href="http://esrisingapore.com.sg/" target="_blank" rel="noopener"&gt;Esri Singapore&lt;/a&gt;, &lt;a href="https://www.gpslands.com/" target="_blank" rel="noopener"&gt;GPS Lands&lt;/a&gt;, &lt;a href="https://www.oracle.com/sg/corporate/contact/" target="_blank" rel="noopener"&gt;Oracle&lt;/a&gt;, and &lt;a href="https://www.yjpsurveyors.com/" target="_blank" rel="noopener"&gt;YJP Surveyors&lt;/a&gt;.
It was a great pleasure to take part in hosting this event.&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/2_hu_4bdee666c9907a6f.webp 400w,
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/2_hu_c66d1c07b967dff1.webp 760w,
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/2_hu_d2a3ac05dc290d56.webp 1200w"
src="https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/2_hu_4bdee666c9907a6f.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;We contributed five papers to the event, one of which was presented as a keynote:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Dehbi Y (2019): Raise the roof: towards generating LoD2 models without aerial surveys using machine learning. &lt;em&gt;ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci.&lt;/em&gt; IV-4/W8:27-34. &lt;a href="https://doi.org/10.5194/isprs-annals-IV-4-W8-27-2019" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-IV-4-W8-27-2019&lt;/a&gt; &lt;a href="https://ual.sg/publication/2019-inferring-roof-type/2019-inferring-roof-type.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Noardo F, Biljecki F, Agugiaro G, Arroyo Ohori K, Ellul C, Harrie L, Stoter J (2019): GeoBIM benchmark 2019: intermediate results. &lt;em&gt;Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.&lt;/em&gt; XLII-4/W15:47–52. &lt;a href="https://doi.org/10.5194/isprs-archives-XLII-4-W15-47-2019" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-archives-XLII-4-W15-47-2019&lt;/a&gt; &lt;a href="https://ual.sg/publication/2019-geobim-intermediate/2019-geobim-intermediate.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Tauscher H (2019): Quality of BIM-GIS conversion. &lt;em&gt;ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci.&lt;/em&gt; IV-4/W8:35–42. &lt;a href="https://doi.org/10.5194/isprs-annals-IV-4-W8-35-2019" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-IV-4-W8-35-2019&lt;/a&gt; &lt;a href="https://ual.sg/publication/2019-bim-gis-quality/2019-bim-gis-quality.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Stoter J, Ho S, Biljecki F (2019): Considerations for a contemporary 3D cadastre for our times. &lt;em&gt;Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.&lt;/em&gt; XLII-4/W15:81–88. &lt;a href="https://doi.org/10.5194/isprs-archives-XLII-4-W15-81-2019" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-archives-XLII-4-W15-81-2019&lt;/a&gt; &lt;a href="https://ual.sg/publication/2019-considerations-3-d-cadastre/2019-considerations-3-d-cadastre.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Lim J, Tauscher H, Biljecki F (2019): Graph transformation rules for IFC-to-CityGML attribute conversion. &lt;em&gt;ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci.&lt;/em&gt; IV-4/W8:83–90. &lt;a href="https://doi.org/10.5194/isprs-annals-IV-4-W8-83-2019" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-IV-4-W8-83-2019&lt;/a&gt; &lt;a href="https://ual.sg/publication/2019-graph-transformation-rules-ifc-citygml/2019-graph-transformation-rules-ifc-citygml.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;All the papers from the event are published open access in the &lt;a href="https://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/IV-4-W8/" target="_blank" rel="noopener"&gt;ISPRS Annals&lt;/a&gt; (full papers) and &lt;a href="https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLII-4-W15/" target="_blank" rel="noopener"&gt;ISPRS Archives&lt;/a&gt; (short papers / extended abstracts).&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/3_hu_7f5ec9cd30f8cde3.webp 400w,
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/3_hu_85dd278cb275d4a.webp 760w,
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/3_hu_41cc78d56a219ac4.webp 1200w"
src="https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/3_hu_7f5ec9cd30f8cde3.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;The conference was opened by Mr Tan Boon Khai (Chief Executive of SLA) and Prof Lam Khee Poh (Dean of the School of Design and Environment at NUS).
We also had the privilege of having Prof Lui Pao Chuen (Advisor, &lt;a href="https://www.nrf.gov.sg" target="_blank" rel="noopener"&gt;National Research Foundation&lt;/a&gt;) enrich the event as the guest of honour.
Besides the presentations of the 36 peer-reviewed papers, the conference had four invited speakers: Ms Yap Lay Bee (&lt;a href="https://www.ura.gov.sg/" target="_blank" rel="noopener"&gt;Singapore&amp;rsquo;s Urban Redevelopment Authority&lt;/a&gt;), Dr Thomas Reindl (&lt;a href="http://www.seris.nus.edu.sg" target="_blank" rel="noopener"&gt;Solar Energy Institute of Singapore&lt;/a&gt;), Mr Jarmo Suomisto (&lt;a href="http://www.hel.fi/3D" target="_blank" rel="noopener"&gt;Helsinki 3D+&lt;/a&gt;), and Mr Carsten Rönsdorf (&lt;a href="http://www.ordnancesurvey.co.uk" target="_blank" rel="noopener"&gt;Ordnance Survey&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/4_hu_203e07153848317c.webp 400w,
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/4_hu_4abdae7c50e0a7ed.webp 760w,
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/4_hu_c83d8bcc2e038528.webp 1200w"
src="https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/4_hu_203e07153848317c.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;We also want to congratulate the winners of the best paper awards.&lt;/p&gt;
&lt;p&gt;The best paper award went to Y. Dehbi, A. Henn, G. Gröger, V. Stroh, and L. Plümer for their paper
&lt;a href="https://doi.org/10.5194/isprs-annals-IV-4-W8-43-2019" target="_blank" rel="noopener"&gt;Active sampling and model based prediction for fast and robust detection and reconstruction of complex roofs in 3D point clouds&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The runner-up best paper recognition was awarded to S. Vitalis, A. Labetski, K. Arroyo Ohori, H. Ledoux, and J. Stoter for their paper &lt;a href="https://doi.org/10.5194/isprs-annals-IV-4-W8-123-2019" target="_blank" rel="noopener"&gt;A data structure to incorporate versioning in 3D city models&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Congratulations &amp;#x1f3c6; &amp;#x1f44f;&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/5_hu_bea289514f7b7a7a.webp 400w,
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/5_hu_6140f678ac8782a3.webp 760w,
/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/5_hu_fc6f31b14fc2c78a.webp 1200w"
src="https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/5_hu_bea289514f7b7a7a.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;It has been a great privilege to have prominent 3D GIS researchers, practitioners, and companies in Singapore for this event. Thanks everyone for attending.&lt;/p&gt;
&lt;p&gt;Please note that the event does not end here.
We are organising a &lt;a href="https://ual.sg/post/2019/09/12/special-issue-in-transactions-in-gis-on-3d-city-modelling-and-bim/"&gt;special issue in Transactions in GIS &amp;lsquo;Emerging Topics in 3D GIS&amp;rsquo;&lt;/a&gt;, which invites authors to extend their conference papers.
That said, the special issue is open to everyone in the 3D GIS/BIM community, as having a paper at the conference is not a requirement for submission.&lt;/p&gt;
&lt;p&gt;Next year&amp;rsquo;s event, &lt;a href="http://3dgeoinfo2020.com" target="_blank" rel="noopener"&gt;15th 3D Geoinfo 2020&lt;/a&gt;, will be hosted by the &lt;a href="https://www.ucl.ac.uk" target="_blank" rel="noopener"&gt;University College London&lt;/a&gt;.
Also, we already know the location of the 2021 event: New York City, hosted by the &lt;a href="https://cusp.nyu.edu" target="_blank" rel="noopener"&gt;New York Univesity&lt;/a&gt;.
See you!&lt;/p&gt;</description></item><item><title>Update on the GeoBIM benchmark</title><link>https://ual.sg/post/2019/09/27/update-on-the-geobim-benchmark/</link><pubDate>Fri, 27 Sep 2019 09:35:16 +0800</pubDate><guid>https://ual.sg/post/2019/09/27/update-on-the-geobim-benchmark/</guid><description>&lt;p&gt;We have been involved in the &lt;a href="https://3d.bk.tudelft.nl/projects/geobim-benchmark/" target="_blank" rel="noopener"&gt;ISPRS/EuroSDR GeoBIM benchmark&lt;/a&gt;, a study to investigate the state of the art of software adoption of IFC and CityGML.
The project is now halfway through its timeline, so we have published a paper and delivered a keynote at the &lt;a href="https://www.3dgeoinfo2019.com" target="_blank" rel="noopener"&gt;2nd BIM/GIS Integration Workshop at the 3D Singapore 2019 event&lt;/a&gt; to give an update on the progress and preliminary results:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Noardo F, Biljecki F, Agugiaro G, Arroyo Ohori K, Ellul C, Harrie L, Stoter J (2019): GeoBIM benchmark 2019: intermediate results. &lt;em&gt;Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.&lt;/em&gt; XLII-4/W15:47–52. &lt;a href="https://doi.org/10.5194/isprs-archives-XLII-4-W15-47-2019" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-archives-XLII-4-W15-47-2019&lt;/a&gt; &lt;a href="https://ual.sg/publication/2019-geobim-intermediate/2019-geobim-intermediate.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The abstract follows.&lt;/p&gt;
&lt;p&gt;An investigation into the implementation state of open standards in software is currently ongoing through the ISPRS/EuroSDR ‘GeoBIM benchmark 2019’ initiative, which kicked off earlier this year. The benchmark activity provides a way of assessing and comparing the functionality of different software packages in GIS and BIM in terms of their ability to handle standardised data (IFC and CityGML) and undertake various tasks using this data. Approximately 65 people have registered to participate so far, with participants from a wide range of backgrounds and proposing to test a variety of software packages. This confirms that the issues under investigation are of interest, and also meets the wider benchmark aim of having a variety of participants, since the project is conceived as using a bottom-up approach with cross-disciplinary and cross-expertise participation. While full benchmark results are not due to be submitted until later this year, interim results have highlighted a number of common issues across multiple software packages, and a web meeting for participants held in July 2019 also led to some improvements in how the benchmark results are being captured.&lt;/p&gt;
&lt;p&gt;For more information please see the &lt;a href="https://ual.sg/publication/2019-geobim-intermediate/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).
There is still time to join the benchmark; the deadline is at the end of October 2019.
To participate, please check the &lt;a href="https://3d.bk.tudelft.nl/projects/geobim-benchmark/" target="_blank" rel="noopener"&gt;website of the GeoBIM benchmark&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2019-geobim-intermediate/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/09/27/update-on-the-geobim-benchmark/page-one_hu_2442a65d6f0f6c7b.webp 400w,
/post/2019/09/27/update-on-the-geobim-benchmark/page-one_hu_53fbcea53e46a6f9.webp 760w,
/post/2019/09/27/update-on-the-geobim-benchmark/page-one_hu_4c68628ec8f961f9.webp 1200w"
src="https://ual.sg/post/2019/09/27/update-on-the-geobim-benchmark/page-one_hu_2442a65d6f0f6c7b.webp"
width="760"
height="497"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2019_geobim_intermediate&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Noardo, F and Biljecki, F and Agugiaro, G and Arroyo Ohori, K and Ellul, C and Harrie, L and Stoter, J}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{GeoBIM benchmark 2019: intermediate results}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2019}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{XLII-4/W15}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{47--52}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-archives-XLII-4-W15-47-2019}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Current trends, challenges, and gaps of contemporary 3D cadastre</title><link>https://ual.sg/post/2019/09/26/current-trends-challenges-and-gaps-of-contemporary-3d-cadastre/</link><pubDate>Thu, 26 Sep 2019 14:35:16 +0800</pubDate><guid>https://ual.sg/post/2019/09/26/current-trends-challenges-and-gaps-of-contemporary-3d-cadastre/</guid><description>&lt;p&gt;We published a new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Stoter J, Ho S, Biljecki F (2019): Considerations for a contemporary 3D cadastre for our times. &lt;em&gt;Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.&lt;/em&gt; XLII-4/W15:81–88. &lt;a href="https://doi.org/10.5194/isprs-archives-XLII-4-W15-81-2019" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-archives-XLII-4-W15-81-2019&lt;/a&gt; &lt;a href="https://ual.sg/publication/2019-considerations-3-d-cadastre/2019-considerations-3-d-cadastre.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;A significant number of studies have been carried out to establish 3D cadastre solutions to improve the registration of multi-level property. Since the inception of research on 3D cadastres (about 20 years ago), the world around us has changed significantly and this also partly changes the context regarding 3D cadastre: technology (e.g. visualisation of 3D information), acquisition techniques and BIM data availability, and policy and organisational structures. This paper aims to explore the implications of these changes on 3D cadastre research with a view to discussing considerations for a contemporary 3D cadastre for our times. The paper draws on social and technical trends, challenges, and gaps around 3D cadastre practices from three jurisdictions: the Australian state of Victoria, the Netherlands, and Singapore. The cases have been selected as examples of well-functioning and highly trusted cadastres and land registries committed to innovation in this area, and whose practitioners and researchers are leading the research in this domain. This set provides a breadth of insight that informs our discussion. However, we acknowledge the limitations of the findings, as the research undertaken in these jurisdictions is not complicated by other registration or cadastre issues that may occur in other countries.&lt;/p&gt;
&lt;p&gt;For more information please see the &lt;a href="https://ual.sg/publication/2019-considerations-3-d-cadastre/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2019-considerations-3-d-cadastre/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/09/26/current-trends-challenges-and-gaps-of-contemporary-3d-cadastre/page-one_hu_6c2136d361d470f3.webp 400w,
/post/2019/09/26/current-trends-challenges-and-gaps-of-contemporary-3d-cadastre/page-one_hu_95fdf84158bd0e7c.webp 760w,
/post/2019/09/26/current-trends-challenges-and-gaps-of-contemporary-3d-cadastre/page-one_hu_c5a0b2897f0882f1.webp 1200w"
src="https://ual.sg/post/2019/09/26/current-trends-challenges-and-gaps-of-contemporary-3d-cadastre/page-one_hu_6c2136d361d470f3.webp"
width="760"
height="453"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2019_considerations_3d_cadastre&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Stoter, J and Ho, S and Biljecki, F}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Considerations for a contemporary 3D cadastre for our times}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci.}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2019}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{XLII-4/W15}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{81--88}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-archives-XLII-4-W15-81-2019}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Paper on quality of BIM-GIS conversion</title><link>https://ual.sg/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/</link><pubDate>Thu, 26 Sep 2019 08:35:16 +0800</pubDate><guid>https://ual.sg/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/</guid><description>
&lt;figure id="figure-combination-of-different-types-of-errors-results-in-multiple-categories-that-are-depending-on-the-use-case-context-source-of-the-dataset-with-modifications-used-to-generate-the-illustration-institute-for-applied-computer-science-karlsruhe-institute-of-technology-häfele-2011"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Combination of different types of errors results in multiple categories that are depending on the use case context. Source of the dataset (with modifications) used to generate the illustration: Institute for Applied Computer Science, Karlsruhe Institute of Technology (Häfele, 2011)." srcset="
/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/featured_hu_20fe0b0c779cddf.webp 400w,
/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/featured_hu_80eabb2d3949208f.webp 760w,
/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/featured_hu_1df59f9c234a5220.webp 1200w"
src="https://ual.sg/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/featured_hu_20fe0b0c779cddf.webp"
width="760"
height="402"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Combination of different types of errors results in multiple categories that are depending on the use case context. Source of the dataset (with modifications) used to generate the illustration: Institute for Applied Computer Science, Karlsruhe Institute of Technology (Häfele, 2011).
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;We published a new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Tauscher H (2019): Quality of BIM-GIS conversion. &lt;em&gt;ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci.&lt;/em&gt; IV-4/W8:35–42. &lt;a href="https://doi.org/10.5194/isprs-annals-IV-4-W8-35-2019" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-IV-4-W8-35-2019&lt;/a&gt; &lt;a href="https://ual.sg/publication/2019-bim-gis-quality/2019-bim-gis-quality.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Much work has been done on quality of geoinformation and interoperability between BIM and GIS.
However, the intersection of the two - quality control of the conversion between BIM and GIS - remains uncharted.
This discussion paper, based on empirical results, is one of the first steps towards mapping out a framework on errors and quality control in the context of BIM–GIS interoperability.
In our work we focus on the conversion from IFC to CityGML, identifying several systematic errors potentially common and/or exclusive to the context of BIM–GIS conversion.
Besides exposing several faults pertaining to IFC-sourced 3D city models, we discuss their taxonomy and their potential impact when engaged in applications.
This paper is also relevant with respect to the growing popularity of conversion between IFC and CityGML, potentially aiding others to avoid many of the errors that can occur in the process and establishing directions to set up a benchmark to assess the performance of the interoperability workflows.&lt;/p&gt;
&lt;p&gt;For more information please see the &lt;a href="https://ual.sg/publication/2019-bim-gis-quality/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2019-bim-gis-quality/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/page-one_hu_ca8a00aaaf9b1c8d.webp 400w,
/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/page-one_hu_b412b49284bce69.webp 760w,
/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/page-one_hu_ab57b9546d4f28.webp 1200w"
src="https://ual.sg/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/page-one_hu_ca8a00aaaf9b1c8d.webp"
width="760"
height="471"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2019_bim_gis_quality&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Biljecki, F and Tauscher, H}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-annals-IV-4-W8-35-2019}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci.}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{35--42}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{{Quality of BIM--GIS conversion}}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{IV-4/W8}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2019}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;figure id="figure-citygml-dataset-obtained-from-ifc-storey-of-one-building-with-the-approach-developed-within-our-project-source-of-the-input-ifc-dataset-bca-singapore"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="CityGML dataset obtained from IFC (storey of one building) with the approach developed within our project. Source of the input IFC dataset: BCA Singapore." srcset="
/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/c1-CityGML-3rd-2_cut_hu_158aede98c243af0.webp 400w,
/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/c1-CityGML-3rd-2_cut_hu_96a0ec81387a67d1.webp 760w,
/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/c1-CityGML-3rd-2_cut_hu_142b64f6571ffd6a.webp 1200w"
src="https://ual.sg/post/2019/09/26/paper-on-quality-of-bim-gis-conversion/c1-CityGML-3rd-2_cut_hu_158aede98c243af0.webp"
width="760"
height="272"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
CityGML dataset obtained from IFC (storey of one building) with the approach developed within our project. Source of the input IFC dataset: BCA Singapore.
&lt;/figcaption&gt;&lt;/figure&gt;</description></item><item><title>Towards generating LoD2 models without aerial surveys using machine learning</title><link>https://ual.sg/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/</link><pubDate>Mon, 23 Sep 2019 18:35:16 +0800</pubDate><guid>https://ual.sg/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/</guid><description>
&lt;figure id="figure-open-3d-city-model-of-hamburg-germany-in-lod2-including-roof-types-we-investigated-whether-we-can-infer-the-type-of-roof-without-traditional-approaches-such-as-photogrammetry"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Open 3D city model of Hamburg, Germany in LoD2 (including roof types). We investigated whether we can infer the type of roof without traditional approaches such as photogrammetry." srcset="
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/featured_hu_32ea759ab9a86436.webp 400w,
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/featured_hu_b2f2fdf160419a8c.webp 760w,
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/featured_hu_8bd3b129cd45a6d.webp 1200w"
src="https://ual.sg/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/featured_hu_32ea759ab9a86436.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Open 3D city model of Hamburg, Germany in LoD2 (including roof types). We investigated whether we can infer the type of roof without traditional approaches such as photogrammetry.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;We published a new paper:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F, Dehbi Y (2019): Raise the roof: towards generating LoD2 models without aerial surveys using machine learning. &lt;em&gt;ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci.&lt;/em&gt; IV-4/W8:27-34. &lt;a href="https://doi.org/10.5194/isprs-annals-IV-4-W8-27-2019" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-IV-4-W8-27-2019&lt;/a&gt; &lt;a href="https://ual.sg/publication/2019-inferring-roof-type/2019-inferring-roof-type.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The paper deals with inferring the type of roof from LoD1 models, potentially contributing to a broader workflow for generating LoD2 models that bypasses traditional approaches.&lt;/p&gt;
&lt;p&gt;The work is a continuation of the &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2017.01.001" target="_blank" rel="noopener"&gt;previous work on generating LoD1 models from building footprints without elevation measurements&lt;/a&gt;, by inferring the heights of buildings solely from 2D data (using Random Forest regression).&lt;/p&gt;
&lt;p&gt;We have used 10 different predictors, e.g. building function, building height, and number of OpenStreetMap amenities in the neighbourhood, some of which are more useful than others.
For example, the vertical extent of the building (storeys and height) may give a good indication about the roof type:&lt;/p&gt;
&lt;figure id="figure-the-vertical-extent-of-the-building-part-gives-an-indication-of-the-roof-type-the-plot-shows-a-01-random-subset-of-our-test-dataset"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="The vertical extent of the building part gives an indication of the roof type. The plot shows a 0.1% random subset of our test dataset." srcset="
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/roof_types_vertical_hist_hu_727b35c550400209.webp 400w,
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/roof_types_vertical_hist_hu_255a2d47aaadfb3a.webp 760w,
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/roof_types_vertical_hist_hu_6ad4ca3d37bbda8a.webp 1200w"
src="https://ual.sg/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/roof_types_vertical_hist_hu_727b35c550400209.webp"
width="760"
height="580"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
The vertical extent of the building part gives an indication of the roof type. The plot shows a 0.1% random subset of our test dataset.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;For training and testing purposes we have used the 3D city model of Hamburg, Germany.
In the end we have achieved an accuracy of 85% in predicting the type of the roof.
We also carried out another classification for predicting whether a roof is flat or not (with 92% accuracy).&lt;/p&gt;
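&lt;p&gt;As a rough, hypothetical sketch of the classification idea (not the actual pipeline, features, or data from the paper), a bootstrap ensemble of height-threshold stumps, in the spirit of a Random Forest, can already separate flat from non-flat roofs when height is informative. All values and function names below are illustrative only.&lt;/p&gt;

```python
import operator
import random

def fit_stump(samples):
    """Pick the height threshold that best separates flat (1) from non-flat (0) roofs.

    A sample is a (height, label) pair; the stump predicts 1 when the height
    exceeds the threshold (operator.gt is used for the comparisons).
    """
    best_t, best_acc = None, -1.0
    for t in sorted(h for h, _ in samples):
        acc = sum(int(operator.gt(h, t)) == y for h, y in samples) / len(samples)
        if operator.gt(acc, best_acc):
            best_t, best_acc = t, acc
    return best_t

def fit_forest(samples, n_trees=25, seed=0):
    """Bagging: train each stump on a bootstrap resample of the data."""
    rng = random.Random(seed)
    return [fit_stump([rng.choice(samples) for _ in samples]) for _ in range(n_trees)]

def predict(forest, height):
    """Majority vote over the stumps: 1 = flat roof, 0 = non-flat."""
    votes = sum(int(operator.gt(height, t)) for t in forest)
    return int(operator.gt(2 * votes, len(forest)))
```

&lt;p&gt;In the paper, the classifier uses ten predictors (not only height) and full Random Forests rather than single-feature stumps; please refer to the publication for the actual features and accuracies.&lt;/p&gt;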
&lt;p&gt;The list of features we have used and their importance (in both determining the roof type and in the binary classification whether a roof is flat or non-flat) is included below:&lt;/p&gt;
&lt;figure id="figure-feature-importance-of-the-two-approaches-some-predictors-are-more-important-than-others"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Feature importance of the two approaches. Some predictors are more important than others." srcset="
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/feature_importance_comparison_hu_26ab373264327326.webp 400w,
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/feature_importance_comparison_hu_7ca9983c1a1792c7.webp 760w,
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/feature_importance_comparison_hu_75454222f7f50ae3.webp 1200w"
src="https://ual.sg/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/feature_importance_comparison_hu_26ab373264327326.webp"
width="760"
height="567"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Feature importance of the two approaches. Some predictors are more important than others.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;While we are happy with how this work turned out and we have made progress, there is still a lot to be done on this topic (e.g. reconstructing the geometry of the roof), which we intend to tackle in future work.
Our roadmap at the moment looks like this:&lt;/p&gt;
&lt;figure id="figure-proposed-pipeline-from-lod0-footprints-to-lod2-models-without-aerial-survey-measurements-for-previous-work-inferring-heights-of-buildings-from-footprints-generating-lod1-models-see-biljecki-et-al-2017httpsdoiorg101016jcompenvurbsys201701001"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Proposed pipeline from LoD0 footprints to LoD2 models without aerial survey measurements. For previous work (inferring heights of buildings from footprints, generating LoD1 models) see ([Biljecki et al., 2017](https://doi.org/10.1016/j.compenvurbsys.2017.01.001))." srcset="
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/roadmap_hu_1d022ef048ee8114.webp 400w,
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/roadmap_hu_f05e19e2e943ab5e.webp 760w,
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/roadmap_hu_63c034dbd68797e3.webp 1200w"
src="https://ual.sg/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/roadmap_hu_1d022ef048ee8114.webp"
width="760"
height="138"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Proposed pipeline from LoD0 footprints to LoD2 models without aerial survey measurements. For previous work (inferring heights of buildings from footprints, generating LoD1 models) see (&lt;a href="https://doi.org/10.1016/j.compenvurbsys.2017.01.001" target="_blank" rel="noopener"&gt;Biljecki et al., 2017&lt;/a&gt;).
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;For more information please see the &lt;a href="https://ual.sg/publication/2019-inferring-roof-type/"&gt;paper&lt;/a&gt; (open access &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://ual.sg/publication/2019-inferring-roof-type/"&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/page-one_hu_138d4c7456868156.webp 400w,
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/page-one_hu_85492ce57e98a32.webp 760w,
/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/page-one_hu_959a756c6891a610.webp 1200w"
src="https://ual.sg/post/2019/09/23/towards-generating-lod2-models-without-aerial-surveys-using-machine-learning/page-one_hu_138d4c7456868156.webp"
width="760"
height="535"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;BibTeX citation:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-bibtex" data-lang="bibtex"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="nc"&gt;@article&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;2019_inferring_roof_type&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;author&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Biljecki, F and Dehbi, Y}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;doi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{10.5194/isprs-annals-IV-4-W8-27-2019}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;journal&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci.}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;pages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{27--34}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;title&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{Raise the roof: towards generating LoD2 models without aerial surveys using machine learning}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;volume&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{IV-4/W8}&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; &lt;span class="na"&gt;year&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;{2019}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description></item><item><title>Special issue in Transactions in GIS on 3D city modelling and BIM</title><link>https://ual.sg/post/2019/09/12/special-issue-in-transactions-in-gis-on-3d-city-modelling-and-bim/</link><pubDate>Thu, 12 Sep 2019 16:43:45 +0800</pubDate><guid>https://ual.sg/post/2019/09/12/special-issue-in-transactions-in-gis-on-3d-city-modelling-and-bim/</guid><description>&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/09/12/special-issue-in-transactions-in-gis-on-3d-city-modelling-and-bim/featured_hu_8d8288a12324274d.webp 400w,
/post/2019/09/12/special-issue-in-transactions-in-gis-on-3d-city-modelling-and-bim/featured_hu_d55eafc3146e1a83.webp 760w,
/post/2019/09/12/special-issue-in-transactions-in-gis-on-3d-city-modelling-and-bim/featured_hu_ed14113d88409860.webp 1200w"
src="https://ual.sg/post/2019/09/12/special-issue-in-transactions-in-gis-on-3d-city-modelling-and-bim/featured_hu_8d8288a12324274d.webp"
width="760"
height="382"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;div class="alert alert-note"&gt;
&lt;div&gt;
The special issue has been published in February 2021.
Please see the &lt;a href="https://ual.sg/post/2021/02/17/publication-of-the-collection-emerging-topics-in-3d-gis/"&gt;related blog post&lt;/a&gt;.
Thanks to all the authors for submitting their contributions.
&lt;/div&gt;
&lt;/div&gt;
&lt;p&gt;As part of the &lt;a href="https://ual.sg/post/2019/07/28/3d-geoinfo-2019-conference-in-singapore/"&gt;3D GeoInfo 2019 conference, and the 2nd BIM/GIS Integration Workshop&lt;/a&gt;, we are organising a special issue in &lt;a href="https://onlinelibrary.wiley.com/journal/14679671" target="_blank" rel="noopener"&gt;&lt;em&gt;Transactions in GIS&lt;/em&gt;&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Please find the call for papers below.
It is important to note that this special issue is open to everyone regardless of conference or workshop participation, and projects and topics not presented at the event are welcome as well.&lt;/p&gt;
&lt;p&gt;Transactions in GIS is a highly ranked journal in its subject area, and there are no publication fees involved (unless you opt for open access).&lt;/p&gt;
&lt;h2 id="call-for-papers"&gt;Call for Papers&lt;/h2&gt;
&lt;h2 id="transactions-in-gis"&gt;Transactions in GIS&lt;/h2&gt;
&lt;h2 id="special-issue-emerging-topics-in-3d-gis"&gt;Special Issue: Emerging topics in 3D GIS&lt;/h2&gt;
&lt;p&gt;This special issue focuses on the latest developments and applications in advanced 3D data and technologies.
It is primarily aimed at participants of the 3D GeoInfo 2019 Conference and the 2nd BIM/GIS Integration Workshop as an opportunity to extend their research papers published at the conference and workshop.
However, the special issue is also open to everyone in the 3D city modelling and BIM community, as participation and publication at the conference &amp;amp; workshop are not a requirement.&lt;/p&gt;
&lt;p&gt;We welcome high-quality contributions proposing solutions and approaches in the domain of the following topics:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;3D geoinformation requirements&lt;/li&gt;
&lt;li&gt;3D data acquisition and processing&lt;/li&gt;
&lt;li&gt;3D city modelling and its standards&lt;/li&gt;
&lt;li&gt;4D/5D modelling, GeoBIM&lt;/li&gt;
&lt;li&gt;3D geometry, topology and semantics&lt;/li&gt;
&lt;li&gt;Visualisation and dissemination of 3D data&lt;/li&gt;
&lt;li&gt;3D GIS, spatial analysis and its applications (cadastre, utilities, infrastructure, navigation, planning, geology, disaster and risk management, archaeology, marine systems, and simulations)&lt;/li&gt;
&lt;li&gt;Big data and GIS-based urban analytics&lt;/li&gt;
&lt;li&gt;Legal and institutional considerations&lt;/li&gt;
&lt;li&gt;Integrated collaborative environments&lt;/li&gt;
&lt;li&gt;Standards in BIM and GIS&lt;/li&gt;
&lt;li&gt;Level of detail and level of development&lt;/li&gt;
&lt;li&gt;Interoperability and geo-referencing&lt;/li&gt;
&lt;li&gt;Integration for Decision Science and Risks&lt;/li&gt;
&lt;li&gt;Automatic change analysis between BIM and GIS models&lt;/li&gt;
&lt;li&gt;3D visualisation&lt;/li&gt;
&lt;li&gt;Virtual design and construction&lt;/li&gt;
&lt;li&gt;Virtual reality and augmented reality&lt;/li&gt;
&lt;li&gt;Algorithms to generate BIM/GIS models from point cloud data&lt;/li&gt;
&lt;li&gt;BIM and GIS integration with 3D point clouds&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="submission-process"&gt;Submission process&lt;/h2&gt;
&lt;p&gt;The manuscripts will be subject to the regular Transactions in GIS peer review process.
&lt;mark&gt;The submission deadline is &lt;del&gt;1 March 2020&lt;/del&gt; 15 March 2020&lt;/mark&gt;.
Please prepare the papers according to the journal&amp;rsquo;s &lt;a href="https://onlinelibrary.wiley.com/page/journal/14679671/homepage/forauthors.html" target="_blank" rel="noopener"&gt;author guidelines&lt;/a&gt;.
The manuscript has to be submitted through the &lt;a href="https://mc.manuscriptcentral.com/tgis" target="_blank" rel="noopener"&gt;usual channel of the journal&lt;/a&gt;; during submission, please indicate that it is for a special issue: in step 6, toggle the corresponding option and enter the title of the SI:
&lt;mark&gt;Emerging topics in 3D GIS&lt;/mark&gt;.&lt;/p&gt;
&lt;h2 id="schedule"&gt;Schedule&lt;/h2&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Stage&lt;/th&gt;
&lt;th&gt;Date&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Submission deadline&lt;/td&gt;
&lt;td&gt;&lt;del&gt;1 March 2020&lt;/del&gt; 15 March 2020&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;First round notification&lt;/td&gt;
&lt;td&gt;1 June 2020&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Revision due&lt;/td&gt;
&lt;td&gt;1 July 2020&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Final notification&lt;/td&gt;
&lt;td&gt;1 September 2020&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Final manuscript due&lt;/td&gt;
&lt;td&gt;15 September 2020&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Tentative online publication&lt;/td&gt;
&lt;td&gt;1 December 2020&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Expected publication of the special issue&lt;/td&gt;
&lt;td&gt;Early 2021&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h2 id="guest-editors"&gt;Guest editors&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Filip Biljecki, National University of Singapore, Singapore&lt;/li&gt;
&lt;li&gt;Rudi Stouffs, National University of Singapore, Singapore&lt;/li&gt;
&lt;li&gt;Mohsen Kalantari, University of Melbourne, Australia&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>Visit by students from HFT Stuttgart</title><link>https://ual.sg/post/2019/08/31/visit-by-students-from-hft-stuttgart/</link><pubDate>Sat, 31 Aug 2019 18:20:18 +0800</pubDate><guid>https://ual.sg/post/2019/08/31/visit-by-students-from-hft-stuttgart/</guid><description>&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="HFT Stuttgart visit to SDE" srcset="
/post/2019/08/31/visit-by-students-from-hft-stuttgart/1_hu_e41bedf2f75fb577.webp 400w,
/post/2019/08/31/visit-by-students-from-hft-stuttgart/1_hu_936c656190896774.webp 760w,
/post/2019/08/31/visit-by-students-from-hft-stuttgart/1_hu_ed32f24a1d7a78f7.webp 1200w"
src="https://ual.sg/post/2019/08/31/visit-by-students-from-hft-stuttgart/1_hu_e41bedf2f75fb577.webp"
width="760"
height="507"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;We are pleased to have been visited by a group of master students studying at the &lt;a href="https://www.hft-stuttgart.de" target="_blank" rel="noopener"&gt;Stuttgart Technology University of Applied Sciences (Hochschule für Technik)&lt;/a&gt; in Germany.
The visit was accompanied by &lt;a href="https://scholar.google.com/citations?user=WHwZRQ0AAAAJ&amp;amp;hl=en" target="_blank" rel="noopener"&gt;Prof Alias Abdul Rahman&lt;/a&gt; (&lt;a href="https://www.utm.my" target="_blank" rel="noopener"&gt;Universiti Teknologi Malaysia - UTM&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;The students are spending their summer on a study trip in Southeast Asia, and during their visit to Singapore they learned more about NUS and about the GIS and geomatics research and teaching activities of our lab.&lt;/p&gt;
&lt;p&gt;The group was also given a tour of &lt;a href="https://www.arch.nus.edu.sg/about/facilities/net-zero-energy-building-sde-4/" target="_blank" rel="noopener"&gt;SDE4&lt;/a&gt; - the new building of our school, notable for being &lt;a href="https://www.straitstimes.com/singapore/environment/nus-launches-singapores-first-net-zero-energy-building-to-be-built-from" target="_blank" rel="noopener"&gt;the first net-zero energy building in Singapore&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Release of 3D building open data of HDBs in Singapore</title><link>https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/</link><pubDate>Sun, 25 Aug 2019 09:19:45 +0800</pubDate><guid>https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/</guid><description>&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/featured_hu_26877e923249e9ac.webp 400w,
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/featured_hu_61a107ed6762808f.webp 760w,
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/featured_hu_3369fab9b4628670.webp 1200w"
src="https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/featured_hu_26877e923249e9ac.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;Cities around the world are increasingly releasing their 3D city models as open data.
Researchers and practitioners in different disciplines are using 3D geoinformation to carry out &lt;a href="https://doi.org/10.3390/ijgi4042842" target="_blank" rel="noopener"&gt;a variety of spatial analyses&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;At NUS we have been working on generating a 3D city model of all public housing (HDB) buildings in Singapore, by conflating different open datasets.
Public housing accommodates the vast majority of Singapore&amp;rsquo;s population, so the dataset covers most of the nation&amp;rsquo;s residential buildings.&lt;/p&gt;
&lt;p&gt;We are happy to announce that we are &lt;a href="https://github.com/ualsg/hdb3d-data" target="_blank" rel="noopener"&gt;releasing its first version as open data&lt;/a&gt;, hopefully benefiting researchers who previously did not have access to such data.
To the best of our knowledge, this is the first large-scale open dataset of 3D buildings in Singapore.&lt;/p&gt;
&lt;p&gt;We offer two formats:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.cityjson.org" target="_blank" rel="noopener"&gt;CityJSON&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://en.wikipedia.org/wiki/Wavefront_.obj_file" target="_blank" rel="noopener"&gt;OBJ&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Both can be downloaded on our &lt;a href="https://github.com/ualsg/hdb3d-data" target="_blank" rel="noopener"&gt;Github repository&lt;/a&gt;.
While the dataset covers almost all HDB buildings and is semantically rich, please note that this is a first version and still a work in progress: there are known errors (e.g. some locations are wrongly associated due to addressing issues) and a long to-do list. Please read more about the known issues at our Github &lt;a href="https://github.com/ualsg/hdb3d-data" target="_blank" rel="noopener"&gt;repo&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The dataset can also be viewed through Mapbox:&lt;/p&gt;
&lt;script src='https://api.tiles.mapbox.com/mapbox-gl-js/v1.2.1/mapbox-gl.js'&gt;&lt;/script&gt;
&lt;link href='https://api.tiles.mapbox.com/mapbox-gl-js/v1.2.1/mapbox-gl.css' rel='stylesheet' /&gt;
&lt;div id='map'&gt;&lt;/div&gt;
&lt;script&gt;
mapboxgl.accessToken = 'pk.eyJ1IjoiZmlsaXBiIiwiYSI6ImJtaER3cUEifQ.PsDMGtN7hWbbZRON0HveLQ';
var map = new mapboxgl.Map({
style: 'mapbox://styles/filipb/cjzqf8edw0j171ct08lpcx1hr',
center: [103.724315, 1.349543],
zoom: 15.5,
pitch: 45,
bearing: -17.6,
container: 'map',
antialias: true,
attributionControl: false
}).addControl(new mapboxgl.AttributionControl({
customAttribution: "&amp;copy; NUS Urban Analytics Lab &amp;copy; HDB &amp;copy; OneMap"}));
&lt;/script&gt;
&lt;h3 id="method"&gt;Method&lt;/h3&gt;
&lt;p&gt;We have used extrusion, a straightforward and common way to generate 3D models.
It uses information about the height of each object (usually obtained from airborne laser scanning; lidar) and the geometry of the footprint.
This method results in &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2016.04.005" target="_blank" rel="noopener"&gt;LoD1 (block) models&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;While this method is simple and building footprints are nowadays widely available (we have used &lt;a href="https://www.openstreetmap.org" target="_blank" rel="noopener"&gt;OpenStreetMap&lt;/a&gt;), it is hampered by the lack of open data on the heights of objects.
There is no publicly available lidar dataset covering Singapore.&lt;/p&gt;
&lt;p&gt;Therefore, as a proxy for the height, we have used the number of storeys of each block, which is available as &lt;a href="https://data.gov.sg/dataset/hdb-property-information" target="_blank" rel="noopener"&gt;open data in Singapore&lt;/a&gt;.
Although prone to errors, this method has been &lt;a href="https://doi.org/10.1016/j.compenvurbsys.2017.01.001" target="_blank" rel="noopener"&gt;very popular&lt;/a&gt; around the world and benefited a myriad of researchers.&lt;/p&gt;
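&lt;p&gt;Conceptually, the extrusion step looks like the minimal sketch below. It is an illustration of the general technique rather than our production code: the assumed 3 m per storey and the rectangular example footprint are hypothetical.&lt;/p&gt;

```python
# Sketch of storeys-based extrusion to LoD1 (illustrative, not the repo code).
# Assumption: an average floor-to-floor height of 3.0 m per storey.
STOREY_HEIGHT_M = 3.0

def estimate_height(storeys: int) -> float:
    """Approximate building height from the number of storeys."""
    return storeys * STOREY_HEIGHT_M

def extrude_footprint(footprint, storeys):
    """Turn a 2D footprint (list of (x, y) vertices) into an LoD1 block:
    a bottom ring at z = 0 and a top ring at the estimated height."""
    h = estimate_height(storeys)
    bottom = [(x, y, 0.0) for x, y in footprint]
    top = [(x, y, h) for x, y in footprint]
    return bottom, top

# Example: a 10-storey block with a hypothetical rectangular footprint
bottom, top = extrude_footprint([(0, 0), (30, 0), (30, 12), (0, 12)], 10)
```

&lt;p&gt;The two rings, joined by wall faces, yield the familiar block model; the real workflow of course also handles holes, shared walls, and attribute transfer.&lt;/p&gt;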
&lt;h3 id="data-sources"&gt;Data sources&lt;/h3&gt;
&lt;p&gt;We have used the following datasets:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.openstreetmap.org" target="_blank" rel="noopener"&gt;2D building footprints from OpenStreetMap&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://data.gov.sg/dataset/hdb-property-information" target="_blank" rel="noopener"&gt;HDB Property Information&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;We linked the two using &lt;a href="https://docs.onemap.sg" target="_blank" rel="noopener"&gt;OneMap&amp;rsquo;s geocoder&lt;/a&gt;.
While OpenStreetMap&amp;rsquo;s completeness in Singapore is high and in most cases sufficient to generate a 3D model, the height (and/or number of storeys) and address information are not available for all buildings; thus HDB&amp;rsquo;s dataset was used along with OneMap&amp;rsquo;s API to link the two.
Furthermore, HDB&amp;rsquo;s dataset contains many attributes for each block that are not available in OpenStreetMap, greatly increasing the semantic richness of the dataset.
More information about the workflow is available at &lt;a href="https://github.com/ualsg/hdb3d-code" target="_blank" rel="noopener"&gt;our Github repo&lt;/a&gt;, where we have released the code as open-source as well.&lt;/p&gt;
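&lt;p&gt;The conflation step can be sketched as below. The geocoder is injected as a plain callable standing in for OneMap, since the actual API requests and response handling live in the repo; all record names and the stub lookup are illustrative.&lt;/p&gt;

```python
# Illustrative sketch of joining HDB records to OSM footprints via a geocoder
# that maps "block + street" addresses to footprint IDs (stand-in for OneMap).

def link_hdb_to_osm(hdb_records, osm_footprints, geocode):
    """Attach HDB attributes (prefixed 'hdb_') to OSM footprint attributes.

    hdb_records: list of dicts with 'blk_no' and 'street' keys
    osm_footprints: dict mapping osm_id to a dict of OSM attributes
    geocode: callable (blk_no, street) returning an osm_id or None
    """
    linked = {}
    for rec in hdb_records:
        osm_id = geocode(rec["blk_no"], rec["street"])
        if osm_id is None or osm_id not in osm_footprints:
            continue  # addressing issues: some blocks cannot be matched
        merged = dict(osm_footprints[osm_id])
        merged.update({f"hdb_{k}": v for k, v in rec.items()})
        linked[osm_id] = merged
    return linked

# Tiny worked example with a stub geocoder:
hdb = [{"blk_no": "95B", "street": "HENDERSON ROAD", "year_completed": 2018}]
osm = {"way/440545194": {"osm_building": "residential"}}
lookup = {("95B", "HENDERSON ROAD"): "way/440545194"}
linked = link_hdb_to_osm(hdb, osm, geocode=lambda b, s: lookup.get((b, s)))
```

&lt;p&gt;Records that fail to geocode are simply skipped, which is why a few blocks end up with the wrong or missing association, as noted among the known issues.&lt;/p&gt;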
&lt;h3 id="geometry"&gt;Geometry&lt;/h3&gt;
&lt;p&gt;There is not much to say about the geometry.
It is in LoD1, and thanks to the hard work of the OpenStreetMap community, the footprints look pretty good.
Have a look yourself:&lt;/p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/hdb3d-c1_att_hu_1faf6bd8158b75d6.webp 400w,
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/hdb3d-c1_att_hu_1fbae770ddffd0fb.webp 760w,
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/hdb3d-c1_att_hu_1b8437a18fffe397.webp 1200w"
src="https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/hdb3d-c1_att_hu_1faf6bd8158b75d6.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;div
style="position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;"&gt;
&lt;iframe
src="https://player.vimeo.com/video/355799290?dnt=0"
style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;" allow="fullscreen"&gt;
&lt;/iframe&gt;
&lt;/div&gt;
&lt;h3 id="attributes"&gt;Attributes&lt;/h3&gt;
&lt;p&gt;The dataset is semantically rich thanks to coupling the two datasets (&lt;a href="https://www.openstreetmap.org" target="_blank" rel="noopener"&gt;OpenStreetMap&lt;/a&gt; and the &lt;a href="https://data.gov.sg/dataset/hdb-property-information" target="_blank" rel="noopener"&gt;Housing and Development Board open data on property information&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;The resulting CityJSON dataset includes several attributes, such as address and number of units in each block (even decomposed by number of rooms thanks to the open data of HDB).
Check the left pane of the screenshot from &lt;a href="https://itunes.apple.com/nl/app/azul/id1173239678?mt=12" target="_blank" rel="noopener"&gt;azul&lt;/a&gt; for an example block in Bukit Merah:&lt;/p&gt;
&lt;figure id="figure-data-viewed-in-azul-free-macos-software-for-cityjson-showing-the-attributes-for-a-block-in-bukit-merah"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Data viewed in azul (free macOS software for CityJSON) showing the attributes for a block in Bukit Merah." srcset="
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/azul-hdb_hu_869f45abf7782da4.webp 400w,
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/azul-hdb_hu_56ff304f10e8f9e9.webp 760w,
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/azul-hdb_hu_4674e81a4aa33d63.webp 1200w"
src="https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/azul-hdb_hu_869f45abf7782da4.webp"
width="760"
height="495"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Data viewed in azul (free macOS software for CityJSON) showing the attributes for a block in Bukit Merah.
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;p&gt;For the list of attributes please scroll down to the metadata section.&lt;/p&gt;
&lt;p&gt;Please note that the OBJ file contains only the geometry since it&amp;rsquo;s a 3D computer graphics format.&lt;/p&gt;
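&lt;p&gt;Since CityJSON is plain JSON, the attributes can be read with a few lines of Python. The snippet below is a minimal sketch using a small inline sample shaped like the dataset; the attribute names follow the tables in the metadata section below.&lt;/p&gt;

```python
import json  # a real file would be loaded with json.load(open("hdb.city.json"))

def building_attributes(cityjson: dict) -> dict:
    """Map each Building CityObject ID to its attribute dictionary."""
    return {oid: obj.get("attributes", {})
            for oid, obj in cityjson["CityObjects"].items()
            if obj.get("type") == "Building"}

# Hypothetical sample mimicking one block of the dataset:
sample = {
    "type": "CityJSON",
    "CityObjects": {
        "way/440545194": {
            "type": "Building",
            "attributes": {"hdb_blk_no": "95B", "height": 113.3},
        }
    },
}
attrs = building_attributes(sample)
```

&lt;p&gt;From here it is straightforward to filter blocks by, say, year of completion or number of units.&lt;/p&gt;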
&lt;h3 id="download"&gt;Download&lt;/h3&gt;
&lt;p&gt;If all this sounds good to you, the dataset is available on our &lt;a href="https://github.com/ualsg/hdb3d-data" target="_blank" rel="noopener"&gt;&lt;i class="fab fa-github"&gt;&lt;/i&gt; Github repo&lt;/a&gt;.
The datasets are below 100MB in size thanks to the compactness of the CityJSON and OBJ formats.&lt;/p&gt;
&lt;mark&gt;Terms of use: if using the data, please mention the following data sources: NUS Urban Analytics Lab, HDB Singapore, OpenStreetMap contributors, and OneMap.
If you are using it for a nice publication, please cite the following &lt;a href="https://doi.org/10.5194/isprs-annals-VI-4-W1-2020-37-2020" target="_blank" rel="noopener"&gt;paper&lt;/a&gt;:&lt;/mark&gt;
&lt;div class="highlight"&gt;&lt;pre tabindex="0" class="chroma"&gt;&lt;code class="language-fallback" data-lang="fallback"&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;@article{2020_3dgeoinfo_3d_asean,
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; author = {Biljecki, F},
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; doi = {10.5194/isprs-annals-vi-4-w1-2020-37-2020},
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; journal = {ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences},
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; pages = {37--44},
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; title = {{Exploration of open data in Southeast Asia to generate 3D building models}},
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; volume = {VI-4/W1-2020},
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt; year = {2020}
&lt;/span&gt;&lt;/span&gt;&lt;span class="line"&gt;&lt;span class="cl"&gt;}
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Feel free to drop us an email if you do something interesting with the dataset.&lt;/p&gt;
&lt;h3 id="code"&gt;Code&lt;/h3&gt;
&lt;p&gt;The code used to generate the data is &lt;a href="https://github.com/ualsg/hdb3d-code" target="_blank" rel="noopener"&gt;available as open-source&lt;/a&gt; as well.&lt;/p&gt;
&lt;h3 id="metadata"&gt;Metadata&lt;/h3&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Key&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Coordinate system&lt;/td&gt;
&lt;td&gt;SVY21 / Singapore TM (&lt;a href="https://epsg.io/3414" target="_blank" rel="noopener"&gt;EPSG:3414&lt;/a&gt;)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Unit&lt;/td&gt;
&lt;td&gt;m&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Number of buildings&lt;/td&gt;
&lt;td&gt;12119&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://doi.org/10.1016/j.compenvurbsys.2016.04.005" target="_blank" rel="noopener"&gt;Level of Detail&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;1.2&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;geographicalExtent&lt;/td&gt;
&lt;td&gt;[11474, 28055, 0, 45327, 48759, 142]&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;geographicLocation&lt;/td&gt;
&lt;td&gt;Singapore, Republic of Singapore&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Dataset version&lt;/td&gt;
&lt;td&gt;2019-08-25&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;OSM input data version&lt;/td&gt;
&lt;td&gt;2019-07-18&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;HDB input data version&lt;/td&gt;
&lt;td&gt;2019-07-05&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;As shown above, the dataset is semantically quite rich thanks to the input datasets.
The attributes from OpenStreetMap are prefixed with &lt;code&gt;osm_&lt;/code&gt;, while the ones from the HDB dataset are prefixed with &lt;code&gt;hdb_&lt;/code&gt;.
Please note that some information is duplicated in both datasets.
We have included both sets of information just in case.
A non-exhaustive list of the typical attributes follows:&lt;/p&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Key&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;th&gt;Example&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;osm_id&lt;/td&gt;
&lt;td&gt;ID of the geometry in OpenStreetMap&lt;/td&gt;
&lt;td&gt;&lt;code&gt;way/440545194&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;osm_timestamp&lt;/td&gt;
&lt;td&gt;Time of update of the 2D geometry in OSM&lt;/td&gt;
&lt;td&gt;&lt;code&gt;2016-09-04T05:06:21&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;osm_building&lt;/td&gt;
&lt;td&gt;General tag for buildings&lt;/td&gt;
&lt;td&gt;&lt;code&gt;residential&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;osm_addr_city&lt;/td&gt;
&lt;td&gt;&lt;a href="https://wiki.openstreetmap.org/wiki/Key:addr" target="_blank" rel="noopener"&gt;Various address information&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;Singapore&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;osm_residential&lt;/td&gt;
&lt;td&gt;A general tag for additional information&lt;/td&gt;
&lt;td&gt;&lt;code&gt;HDB&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&amp;hellip;&lt;/td&gt;
&lt;td&gt;&amp;hellip;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;hdb_blk_no&lt;/td&gt;
&lt;td&gt;HDB block number&lt;/td&gt;
&lt;td&gt;&lt;code&gt;95B&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;hdb_street&lt;/td&gt;
&lt;td&gt;Street name&lt;/td&gt;
&lt;td&gt;&lt;code&gt;HENDERSON ROAD&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;hdb_residential&lt;/td&gt;
&lt;td&gt;Residential building (Boolean)&lt;/td&gt;
&lt;td&gt;&lt;code&gt;Y&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;hdb_year_completed&lt;/td&gt;
&lt;td&gt;Year of completion&lt;/td&gt;
&lt;td&gt;&lt;code&gt;2018&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;hdb_bldg_contract_town&lt;/td&gt;
&lt;td&gt;Town&lt;/td&gt;
&lt;td&gt;&lt;code&gt;BM&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;hdb_total_dwelling_units&lt;/td&gt;
&lt;td&gt;Number of units&lt;/td&gt;
&lt;td&gt;&lt;code&gt;286&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;hdb_4room_sold&lt;/td&gt;
&lt;td&gt;Number of 4-room sold flats&lt;/td&gt;
&lt;td&gt;&lt;code&gt;104&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&amp;hellip;&lt;/td&gt;
&lt;td&gt;&amp;hellip;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;height&lt;/td&gt;
&lt;td&gt;Estimated height in m&lt;/td&gt;
&lt;td&gt;&lt;code&gt;113.3&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;For the metadata sourced from the &lt;a href="https://data.gov.sg/dataset/hdb-property-information" target="_blank" rel="noopener"&gt;HDB Property Information&lt;/a&gt; you may want to check the information found in the original dataset.
For OSM metadata, there is also a &lt;a href="https://wiki.openstreetmap.org/wiki/Key:building" target="_blank" rel="noopener"&gt;dedicated page&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The dataset includes all types of HDB buildings, including commercial buildings and carparks.
There are around 12 thousand HDB buildings in Singapore:&lt;/p&gt;
&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="" srcset="
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/hdb3d-c3_att_hu_2a750994b57019cd.webp 400w,
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/hdb3d-c3_att_hu_7745da5618c1b3b1.webp 760w,
/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/hdb3d-c3_att_hu_2ae6f1c813276f8f.webp 1200w"
src="https://ual.sg/post/2019/08/25/release-of-3d-building-open-data-of-hdbs-in-singapore/hdb3d-c3_att_hu_2a750994b57019cd.webp"
width="760"
height="428"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
More technical details are available on our Github repositories for &lt;a href="https://github.com/ualsg/hdb3d-data" target="_blank" rel="noopener"&gt;data&lt;/a&gt; and &lt;a href="https://github.com/ualsg/hdb3d-code" target="_blank" rel="noopener"&gt;code&lt;/a&gt;.&lt;/p&gt;
&lt;h3 id="further-reading-updated-september-2020"&gt;Further reading (updated September 2020)&lt;/h3&gt;
&lt;p&gt;A &lt;a href="https://ual.sg/publication/2020-3-dgeoinfo-3-d-asean/"&gt;paper&lt;/a&gt; has been published:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Biljecki F (2020): Exploration of open data in Southeast Asia to generate 3D building models. &lt;em&gt;ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences.&lt;/em&gt; VI-4/W1-2020: 37-44. &lt;a href="https://doi.org/10.5194/isprs-annals-vi-4-w1-2020-37-2020" target="_blank" rel="noopener"&gt;&lt;i class="ai ai-doi-square ai"&gt;&lt;/i&gt; 10.5194/isprs-annals-vi-4-w1-2020-37-2020&lt;/a&gt; &lt;a href="https://ual.sg/publication/2020-3-dgeoinfo-3-d-asean/2020-3-dgeoinfo-3-d-asean.pdf"&gt;&lt;i class="far fa-file-pdf"&gt;&lt;/i&gt; PDF&lt;/a&gt; &lt;i class="ai ai-open-access-square ai"&gt;&lt;/i&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h3 id="point-of-contact"&gt;Point of contact&lt;/h3&gt;
&lt;p&gt;Feel free to get in touch if you have questions or use the data for an interesting purpose.
If you find an error or have a suggestion, Github issues are a preferred mode of communication.&lt;/p&gt;
&lt;h3 id="nus-students"&gt;NUS students&lt;/h3&gt;
&lt;p&gt;Are you an NUS student interested in working on topics such as this one?
There are related &lt;a href="https://ual.sg/opportunities/student-projects"&gt;master thesis topics&lt;/a&gt; that may be of interest to you.&lt;/p&gt;
The program’s long-term goal is to generate conditions that result in more women pursuing scientific careers by lowering the barriers women face when entering STEM disciplines, thus reducing the gender gap.&lt;/p&gt;
&lt;p&gt;The fellowship is open to female applicants from developing countries and emerging economies.&lt;/p&gt;
&lt;p&gt;If you are interested in topics such as urban analytics, GIS, and 3D city modelling and if you are &lt;a href="https://www.facultyforthefuture.net/content/grant-application-process" target="_blank" rel="noopener"&gt;eligible&lt;/a&gt; to apply, we are open in working together on your application.
Feel free to &lt;a href="https://ual.sg/#contact"&gt;contact us&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>3D GeoInfo 2019 conference in Singapore</title><link>https://ual.sg/post/2019/07/28/3d-geoinfo-2019-conference-in-singapore/</link><pubDate>Sun, 28 Jul 2019 10:35:33 +0800</pubDate><guid>https://ual.sg/post/2019/07/28/3d-geoinfo-2019-conference-in-singapore/</guid><description>&lt;p&gt;
&lt;figure &gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="3D GeoInfo 2019 Conference Flyer" srcset="
/post/2019/07/28/3d-geoinfo-2019-conference-in-singapore/3D-GeoInfo-2019-Conference-Flyer_hu_b231777cadd7fd33.webp 400w,
/post/2019/07/28/3d-geoinfo-2019-conference-in-singapore/3D-GeoInfo-2019-Conference-Flyer_hu_297c0fb81e8cbfdc.webp 760w,
/post/2019/07/28/3d-geoinfo-2019-conference-in-singapore/3D-GeoInfo-2019-Conference-Flyer_hu_558e793cabbbd954.webp 1200w"
src="https://ual.sg/post/2019/07/28/3d-geoinfo-2019-conference-in-singapore/3D-GeoInfo-2019-Conference-Flyer_hu_b231777cadd7fd33.webp"
width="536"
height="760"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;/figure&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.3dgeoinfo2019.com/" target="_blank" rel="noopener"&gt;3D GeoInfo&lt;/a&gt; is the flagship conference of the international 3D city modelling community, with the &lt;a href="https://3d.bk.tudelft.nl/events/3dgeoinfo2018/" target="_blank" rel="noopener"&gt;2018 edition held in Delft&lt;/a&gt;.
This year we are quite fortunate to be able to host it in Singapore, together with the &lt;a href="http://sla.gov.sg" target="_blank" rel="noopener"&gt;Singapore Land Authority&lt;/a&gt;, in the last week of September.&lt;/p&gt;
&lt;p&gt;The paper submission and peer review process was competitive: we received 68 papers, but were able to accept only 39.
Besides an attractive selection of papers (for the full list please see the &lt;a href="https://www.3dgeoinfo2019.com/" target="_blank" rel="noopener"&gt;conference website&lt;/a&gt;) and exhibitors, we have been able to secure a very nice venue - the &lt;a href="https://www.acm.org.sg/" target="_blank" rel="noopener"&gt;Asian Civilisations Museum (ACM)&lt;/a&gt; located in downtown Singapore.&lt;/p&gt;
&lt;p&gt;Even though the paper submission deadline passed, there is still a chance to participate: make sure to &lt;a href="https://events.miceneurol.com/3d-geoinfo-2019/register/Site/Register" target="_blank" rel="noopener"&gt;register&lt;/a&gt; before 15 August.&lt;/p&gt;
&lt;p&gt;Please note that this edition of 3D GeoInfo is part of a larger event called 3D Singapore, which also includes the pre-conference 2nd BIM/GIS Integration Workshop, the Point Clouds Training and the Big Data and Urban Analytics Workshop.&lt;/p&gt;
&lt;p&gt;For more information, including the tentative programme, please visit the &lt;a href="https://www.3dgeoinfo2019.com/" target="_blank" rel="noopener"&gt;conference website&lt;/a&gt;.
See you at the ACM in September!&lt;/p&gt;
&lt;div class="alert alert-note"&gt;
&lt;div&gt;
Update 2019-10-08: The conference has been a success! Read the report &lt;a href="https://ual.sg/post/2019/09/28/3d-geoinfo-2019-in-singapore-a-success.-thanks-everyone/" target="_blank" rel="noopener"&gt;here&lt;/a&gt;. Thanks everyone!
&lt;/div&gt;
&lt;/div&gt;</description></item><item><title>Li Jialin defends her thesis on analysing the influence of short-term rental business on housing prices</title><link>https://ual.sg/post/2019/07/22/li-jialin-defends-her-thesis-on-analysing-the-influence-of-short-term-rental-business-on-housing-prices/</link><pubDate>Mon, 22 Jul 2019 13:59:25 +0800</pubDate><guid>https://ual.sg/post/2019/07/22/li-jialin-defends-her-thesis-on-analysing-the-influence-of-short-term-rental-business-on-housing-prices/</guid><description>&lt;p&gt;Master of Urban Planning student &lt;a href="https://ual.sg/author/jialin-li/"&gt;Jialin Li&lt;/a&gt; successfully defended her thesis &lt;em&gt;The Implementation of Big Data Analysis in Regulating Online Short-Term Rental Business: A case of Airbnb in Beijing&lt;/em&gt;.
The abstract follows:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Since its founding in 2009, Airbnb has grown into one of the most successful sharing economy companies in the world. Expanding its presence to 65,000 cities and 191 countries within 9 years, Airbnb has quietly reshaped our cities. The aim of this thesis is to reveal how Airbnb is influencing Beijing’s neighbourhood housing prices and where the impact is most prominent. Four machine learning models based on various algorithms are developed and trained to analyse the pattern behind housing prices. The random forest model is then selected as the best-fitting model and used to conduct a sensitivity analysis on neighbourhood housing prices given a unit change in Airbnb activity. The spatial pattern of the sensitivity results is then geo-visualized. The results revealed that neighbourhoods along axial roads connecting to core districts and located in sub-centres of periphery districts are more likely to be price sensitive to Airbnb activity. In addition, this thesis discusses the possibility of using big data analysis and geo-visualization as a toolkit for Airbnb regulation and attempts to respond to the findings through location-specific regulations. A series of policy recommendations is then given based on the neighbourhood sensitivity scores.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;A conference paper summarising the work will be published in autumn.&lt;/p&gt;
&lt;p&gt;Congrats Jialin! &amp;#x1f44f; Thanks for the collaboration and we wish you all the best in the continuation of your career as planner at &lt;a href="https://surbanajurong.com" target="_blank" rel="noopener"&gt;Surbana Jurong&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>ISPRS Geospatial Week 2019</title><link>https://ual.sg/post/2019/06/24/isprs-geospatial-week-2019/</link><pubDate>Mon, 24 Jun 2019 07:39:14 +0800</pubDate><guid>https://ual.sg/post/2019/06/24/isprs-geospatial-week-2019/</guid><description>&lt;p&gt;The &lt;a href="https://www.gsw2019.org" target="_blank" rel="noopener"&gt;ISPRS Geospatial Week 2019&lt;/a&gt; was held in Enschede, the Netherlands.
As part of the &lt;a href="http://www2.isprs.org/commissions/comm4/wg10.html" target="_blank" rel="noopener"&gt;ISPRS WG IV/10 (Advanced Geospatial Applications for Smart Cities and Regions)&lt;/a&gt;, we have been involved in the organisation of the &lt;a href="https://www.gsw2019.org/smartgeoapps/" target="_blank" rel="noopener"&gt;ISPRS Workshop on Advanced Geospatial Applications for Smart Cities and Regions (SmartGeoApps 2019)&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The papers have been published in the &lt;a href="https://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/IV-2-W5/index.html" target="_blank" rel="noopener"&gt;ISPRS Annals&lt;/a&gt; and the &lt;a href="https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLII-2-W13/index.html" target="_blank" rel="noopener"&gt;ISPRS Archives&lt;/a&gt; (our prefaces can be found &lt;a href="https://ual.sg/publication/2019-gsw-preface-smartgeoapps-annals/"&gt;here&lt;/a&gt; and &lt;a href="https://ual.sg/publication/2019-gsw-preface-smartgeoapps-arch/"&gt;here&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;We also published a collaborative &lt;a href="https://ual.sg/publication/2019-gsw-geobim-benchmark/"&gt;paper&lt;/a&gt; at the conference, presenting the preliminary results of the &lt;a href="https://3d.bk.tudelft.nl/projects/geobim-benchmark/" target="_blank" rel="noopener"&gt;GeoBIM benchmark&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Visits to Korea and Japan</title><link>https://ual.sg/post/2019/05/22/visits-to-korea-and-japan/</link><pubDate>Wed, 22 May 2019 17:53:40 +0800</pubDate><guid>https://ual.sg/post/2019/05/22/visits-to-korea-and-japan/</guid><description>&lt;p&gt;In May 2019 we visited prominent institutions in the domain of GIS and smart city research in East Asia:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Pusan National University (Prof Ki-Joune Li) 🇰🇷&lt;/li&gt;
&lt;li&gt;University of Seoul (Prof Jiyeong Lee) 🇰🇷&lt;/li&gt;
&lt;li&gt;National Institute of Information and Communications Technology in Tokyo 🇯🇵&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;It was a pleasure to visit these important organisations working on GIS and to present our latest work.&lt;/p&gt;
&lt;p&gt;
&lt;figure id="figure-pusan-national-university"&gt;
&lt;div class="d-flex justify-content-center"&gt;
&lt;div class="w-100" &gt;&lt;img alt="Pusan National University" srcset="
/post/2019/05/22/visits-to-korea-and-japan/2_hu_1e712b3049add87c.webp 400w,
/post/2019/05/22/visits-to-korea-and-japan/2_hu_3c47778e18023904.webp 760w,
/post/2019/05/22/visits-to-korea-and-japan/2_hu_755e49226a9d9a85.webp 1200w"
src="https://ual.sg/post/2019/05/22/visits-to-korea-and-japan/2_hu_1e712b3049add87c.webp"
width="760"
height="570"
loading="lazy" data-zoomable /&gt;&lt;/div&gt;
&lt;/div&gt;&lt;figcaption&gt;
Pusan National University
&lt;/figcaption&gt;&lt;/figure&gt;
&lt;/p&gt;</description></item><item><title>Hello</title><link>https://ual.sg/post/2019/04/12/hello/</link><pubDate>Fri, 12 Apr 2019 11:21:19 +0800</pubDate><guid>https://ual.sg/post/2019/04/12/hello/</guid><description>&lt;p&gt;We are a new lab at the School of Design and Environment at the National University of Singapore focusing on topics related to geospatial technologies and urban analytics.
Please follow our website as we build our research agenda, and feel free to get in touch with us.&lt;/p&gt;</description></item></channel></rss>