Research Projects by NRF Fellows

Algorithmic prediction and practical control of large-scale population dynamics


Our goal is to develop practical tools for understanding phenomena that arise in our everyday experiences when interacting with large crowds. Traffic patterns, for example, form as a result of the routing decisions made independently by millions of individual drivers. Most of the time these intertwined decisions result in a smooth traffic flow, but they can sometimes lead to traffic jams with long delays and millions of man-hours lost. Similar phenomena pervade the social landscape and dictate the evolution of social norms. For example, what are people willing to pay for rent, and where? Which schools or specialisations become popular or unpopular? Which features of these systems' behaviour can be systematically predicted, and which are effectively random? In the traffic routing example, we may not be able to predict the exact time and location of traffic jams, but we may nevertheless be able to estimate their frequency or severity.

Secondly, we aim to explore novel and efficient ways to control and improve such systems. A standard way to adjust traffic, for example, is through tolls. For most tasks this is an impractical means of controlling behaviour, as it requires expensive infrastructure. Other, more readily applicable techniques would introduce small nudges into the behavioural dynamics (e.g. via targeted public service advertising) that result in long-lasting improvements in system performance.
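To make these ideas concrete, here is a minimal sketch (an editor's illustration based on Pigou's classic two-route congestion game, not the project's model; every number is a placeholder). A unit mass of drivers chooses between a fixed road with travel time 1 and a congestible road whose travel time equals the fraction x of drivers using it. Drivers drift toward whichever route currently looks cheaper; without a toll, selfish routing jams the congestible road, while a toll of 0.5 steers the dynamics to the socially optimal split:

    def social_cost(x):
        # Average travel time when a fraction x takes the congestible road
        # (latency x) and the rest take the fixed road (latency 1).
        return x * x + (1 - x) * 1.0

    def simulate(toll=0.0, steps=500, rate=0.01):
        # Best-response dynamics: drivers react to perceived cost, which
        # includes the toll, but the toll itself is not a travel delay.
        x = 0.0
        for _ in range(steps):
            if x + toll < 1.0:        # congestible road looks cheaper
                x = min(1.0, x + rate)
            elif x + toll > 1.0:      # fixed road looks cheaper
                x = max(0.0, x - rate)
        return round(x, 2), round(social_cost(x), 2)

    print(simulate(toll=0.0))   # (1.0, 1.0): selfish equilibrium, everyone queues
    print(simulate(toll=0.5))   # ~(0.5, 0.75): tolled split, about 25% less delay

The toll changes only what drivers perceive, yet it cuts the average delay by a quarter: exactly the kind of small, cheap intervention with a lasting system-level effect described above.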
 
Answering these questions requires bridging two largely distinct disciplines: computer science (e.g. the study of fast and efficient algorithms) and dynamical systems (e.g. chaos theory and the butterfly effect). These connections allow us to peer deeper into the workings of everyday systems and open up new pathways for improving the efficiency of modern cities, such as Singapore.


Dr Georgios Piliouras
Singapore University of Technology and Design
Engineering Systems and Design



Analytics for dynamic urban risk and disaster impact modeling


The ability of our cities to thrive in the face of uncertain future risks will depend on our ability to predict the consequences of disasters so that we can mitigate them, and on our capacity to respond to and recover from them effectively when they occur. The proposed research addresses this dual challenge:
 
(1) The risk profile of cities, particularly Asian cities, is rapidly changing due to climate change, urbanisation, and new patterns of vulnerability. The development of large-scale risk analysis models that account for such dynamics will enable us to anticipate future trends in disaster risk and guide our cities towards resilient trajectories. This will be accomplished by creating time-dependent models for each of the three components that constitute risk: hazard, exposure, and vulnerability. The end product will be a large-scale dynamic risk simulation model applicable to a wide range of cities and hazards. By anticipating future risk, the proposed model focuses on mitigating risk before it is even created, rather than addressing it after the fact, which is often prohibitively expensive or politically infeasible.
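The decomposition into time-dependent components can be illustrated with a minimal sketch (an editor's example, assuming the common convention that risk is the product of hazard, exposure and vulnerability; the trend rates below are invented placeholders, not project data):

    def hazard(t):
        # Annual probability of the hazard, drifting up with climate change.
        return 0.10 * (1 + 0.02 * t)

    def exposure(t):
        # Value of assets in harm's way ($m), growing with urbanisation.
        return 1000 * (1 + 0.05 * t)

    def vulnerability(t):
        # Expected damage fraction if the hazard strikes, falling slowly
        # as building codes and construction practices improve.
        return 0.30 * (1 - 0.01 * t)

    def expected_annual_loss(t):
        # Risk at year t as hazard x exposure x vulnerability.
        return hazard(t) * exposure(t) * vulnerability(t)

    for t in (0, 10, 20):
        print(t, round(expected_annual_loss(t), 1))   # 30.0, 48.6, 67.2

Even with vulnerability improving, the combined trend grows; a dynamic model of this kind is what lets planners spot such trajectories and intervene before the risk is locked in.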
 
(2) In the immediate aftermath of a natural disaster, the first estimates of overall impact are based on an ad-hoc combination of predictive impact models, fly-over reconnaissance missions and field reports. The proposed work focuses on integrating predictive models with new “big-data” streams from satellites, distributed sensors, drones and social media, in order to produce disaster impact estimates with greatly increased accuracy and precision. New approaches to the statistical analysis of geospatial data will be used to transform these data into effective decision-support tools for disaster response and recovery planning. This second research component is linked to the first, since it is through the study of post-disaster consequences that new indicators of pre-disaster vulnerability can be identified.
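One standard way to carry out such model/data integration is a Bayesian update, sketched below (an editor's illustration of the general idea, not necessarily the project's method; all figures are invented). A predictive impact model supplies a prior estimate of, say, the number of damaged buildings in a district, and a noisy satellite-derived count then sharpens it:

    def gaussian_update(prior_mean, prior_var, obs, obs_var):
        # Posterior mean and variance after fusing a Gaussian prior
        # with a Gaussian observation of known noise variance.
        k = prior_var / (prior_var + obs_var)    # weight on the observation
        post_mean = prior_mean + k * (obs - prior_mean)
        post_var = (1 - k) * prior_var
        return post_mean, post_var

    # Prior from the impact model: 500 damaged buildings, std dev 200.
    # Satellite count: 320 damaged buildings, std dev 100.
    mean, var = gaussian_update(500, 200**2, 320, 100**2)
    print(round(mean), round(var ** 0.5))   # 356 89: sharper than either source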
 


Dr David Lallemant
Nanyang Technological University
Earth Observatory of Singapore



Designing novel quantum materials at the microscopic level with ultracold molecules 


Modern-day electronics have transformed our lives by making a world of information and entertainment available at our fingertips. From mobile phones to smart sensors, the high portability of such devices is enabled by the shrinking of electronic components over the last several decades. As we approach the limit of size reduction, new materials with customisable properties hold much promise to provide a disruptive breakthrough in the electronics industry. For instance, regular wires could be replaced with superconductors or flexible ‘plastic metals’: imagine a world where mobile phones and laptops are paper-thin yet sturdy, and can be charged on-the-go by solar cells woven into our clothing! 
 
Despite the lure of these new materials, they remain poorly understood at the microscopic level. The main challenge is that the microscopic picture involves strongly interacting quantum mechanical particles, so the number of parameters needed to describe them grows exponentially with system size: a collection of just 50 interacting two-level particles already requires roughly 10^15 amplitudes to specify its state. This makes such simulations difficult or even impossible to perform on the most powerful computers.
 
My proposed research aims to mitigate this problem by assembling models of these materials in a clean, versatile laboratory setting: atom by atom, molecule by molecule. At low temperatures, these molecules act like quantum nanomachines that mimic the behaviour of new materials at the microscopic level. The molecules will be held in crystals of light, much like electrons embedded in crystals of atoms in these materials. Using these molecular building blocks, I will study materials in which the movement of electrons and heat can be sped up or slowed down as desired, and in which information can be stored robustly for a long time. Such model materials with precisely engineered properties will impact the development of next-generation electronics and future computing technologies that are highly powerful yet energy efficient.


Dr Loh Huanqian
National University of Singapore
Department of Physics 



Fundamental performance limits for statistical learning algorithms 


Enormous amounts of data are constantly generated from various sources: low-cost sensors for collecting climate data, high-throughput sequencing technology for obtaining DNA sequences, political surveys for amassing polling data, and social media sites where users post photos and tweets, to name a few. Given climate data from the past year, a weather scientist may want to predict future weather patterns. However, the sheer volume of weather records available demands increasingly sophisticated data analytics techniques to extract pertinent knowledge efficiently and accurately.
 
While many state-of-the-art techniques are available for analysing data and ensuring that predictions are reliable, one aspect that modern scientific discovery has neglected is the establishment of impossibility results. To explain what this means intuitively, recall that nothing can travel faster than the speed of light; the speed of light is thus a “fundamental limit”. My research programme similarly establishes fundamental limits, but ones pertaining to data analysis. A question I would like to ask is: given climate data of a certain quality and quantity from the past year, what is the variance, or reliability, of the predictions the scientist can make about weather patterns in the next year? We can imagine that if the data collected are sparse and extremely noisy, the predictions cannot be arbitrarily reliable. Quantifying the achievable reliability of predictions is the crux of this research programme. The mathematical toolkit we will draw from is inspired by the technology that goes into one's cellphone.
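A textbook instance of such a limit, used here purely for illustration (it is the Cramér-Rao bound from estimation theory, not a result claimed in the proposal), says that when estimating an unknown mean from n Gaussian measurements with noise level sigma, no unbiased procedure, however sophisticated, can achieve variance below sigma^2 / n. A small simulation shows the ordinary sample mean already sits at this floor:

    import random

    def variance_of_sample_mean(n, sigma, trials=20000, true_mean=0.0):
        # Monte Carlo estimate of the variance of the sample-mean estimator.
        estimates = []
        for _ in range(trials):
            samples = [random.gauss(true_mean, sigma) for _ in range(n)]
            estimates.append(sum(samples) / n)
        m = sum(estimates) / trials
        return sum((e - m) ** 2 for e in estimates) / trials

    n, sigma = 50, 2.0
    print("empirical variance:", round(variance_of_sample_mean(n, sigma), 4))
    print("Cramer-Rao limit:  ", sigma ** 2 / n)   # 0.08; no estimator does better

Collecting sparser or noisier data (smaller n, larger sigma) raises the floor itself: no amount of algorithmic cleverness can compensate, which is precisely the sense in which such limits are fundamental.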

In 1948, Shannon developed a paradigm, known as information theory, for understanding the fundamental limits of communication over noisy media. We use mathematical techniques from information theory to study the fundamental limits of various machine learning tasks, from weather prediction to uncovering hidden communities in Facebook networks.


Dr Vincent Tan
National University of Singapore
Department of Electrical and Computer Engineering 



Investigating altered metabolic pathways in neurodegeneration using organoids 


Amyotrophic Lateral Sclerosis (ALS), also known as Lou Gehrig’s disease, is an age-onset, fatal neurodegenerative disorder that causes the death of motor neurons, resulting in progressive paralysis of all voluntary muscles. Approximately half a million people worldwide suffer from ALS, and in ageing societies, including Singapore, the incidence of ALS will increase. Unfortunately, there is no cure for ALS, and current treatment options do not significantly prolong lifespan or slow disease progression.

Patient-derived stem cells, or induced pluripotent stem cells (iPSCs), have become invaluable tools for studying disease progression because various cell types, including neurons and astrocytes, can be derived from them. However, the conventional method for making neurons produces mostly immature, fetal-like neurons that hardly recapitulate age-onset ALS.

By growing these neurons in spheres or tubes like a mini spinal cord, we mimic the topological properties and composition of a human spinal cord to achieve maturation of iPSC-derived neurons, such that the cellular and molecular features of ALS are exhibited. With this advanced model that more faithfully captures the disease, we aim to investigate metabolic dysfunction in ALS motor neurons and astrocytes, two key players in the pathogenesis of ALS.


Dr Ng Shi Yan
Agency for Science, Technology and Research
Institute of Molecular and Cell Biology



Layerwise microstructure scanner: A new technology for online materials assessment in additive manufacturing


3-D printing, or additive manufacturing (AM), provides the unique capability of making solid objects of arbitrary shape by joining thin layers of material together. One downside of this remarkable technology is that hard-to-detect defects and material heterogeneities can be introduced at the layer level throughout the entire volume of a build. Because of these imperfections, 3-D printed materials are generally less reliable than their conventionally made counterparts, which limits their industrial application.
 
The goal of the proposed research is to develop a new technology to “scan” successive layers of material during AM and identify, and possibly correct, such imperfections. The technology will be based on optical microscopy to ensure fast data acquisition, low cost, and compatibility with most commercially available AM processes. By characterising material layers on the fly, the 3-D distribution of defects and material heterogeneities in the printed parts will be available immediately upon manufacturing. It will therefore be possible to accurately predict, and even “engineer”, the performance and reliability of 3-D printed components without costly post-production quality assessments or additional materials processing.
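The layerwise principle can be caricatured in a few lines (an editor's toy sketch, not the proposed optical instrument): compare each acquired layer image against a defect-free reference and accumulate the flagged pixels into a 3-D defect map as the build grows:

    def flag_defects(layer_img, reference_img, threshold=0.2):
        # Return (row, col) pixels where the layer deviates from the reference.
        flagged = []
        for r, (row_a, row_b) in enumerate(zip(layer_img, reference_img)):
            for c, (a, b) in enumerate(zip(row_a, row_b)):
                if abs(a - b) > threshold:
                    flagged.append((r, c))
        return flagged

    reference = [[0.5, 0.5], [0.5, 0.5]]          # nominal layer appearance
    layers = [
        [[0.5, 0.5], [0.5, 0.5]],                 # clean layer
        [[0.5, 0.9], [0.5, 0.5]],                 # bright anomaly, e.g. a pore
    ]
    defect_map = {z: flag_defects(layer, reference) for z, layer in enumerate(layers)}
    print(defect_map)   # {0: [], 1: [(0, 1)]}

In the real system each "layer image" would be an optical micrograph and the analysis far richer, but the output has the same shape: a volumetric record of where the build deviates from nominal, available the moment printing ends.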
 
Because of these capabilities, the technology to be developed has the potential to boost the economic viability and adoption of AM in industries such as aerospace. Moreover, the optical techniques at the core of this technology may also serve as fast, high-throughput materials characterisation methods for academic research.
 


Dr Matteo Seita
Nanyang Technological University
School of Mechanical and Aerospace Engineering



Role of non-canonical NF-κB signalling in the transcriptional and epigenetic reprogramming of malignant cancers


The nuclear factor κB (NF-κB) family of transcription factors regulates the expression of a broad range of inducible genes critical for various biological processes, such as inflammation and immune responses. While the vast majority of research has focused on the causal role of canonical NF-κB signalling in several human ailments, emerging studies have begun to shed light on the clinical relevance of non-canonical NF-κB signalling in multiple human diseases. In particular, aberrant activation of non-canonical NF-κB signalling has frequently been observed in many human cancers. The primary goal of this research is to elucidate how deregulated non-canonical NF-κB signalling rewires the gene expression programs of some of these cancers to promote their malignant development.
 
We will investigate how non-canonical NF-κB factors bind to regulatory DNA elements to activate a cancer-promoting gene expression program conducive to the malignant development of cancer cells. The proposed research will uncover the functional link between non-canonical NF-κB signalling and cancer malignancy, and can be further extended to autoimmune and metabolic disorders that are causally linked to deregulated non-canonical NF-κB signalling. We will use high-throughput methods such as ChIP sequencing and RNA sequencing, complemented by genome-editing strategies as well as in vitro and in vivo functional assays, to uncover the underlying mechanisms and the impact of these factors on the transcriptional control of cancer development. These studies will advance our eventual goal of identifying novel therapeutic targets and developing new strategies for the selective disruption of cancer-specific transcriptional programs.


Dr Li Yinghui
Nanyang Technological University
School of Biological Sciences

