Featured Researcher: Yunlong (Draco) Xu '23
About the Student Researcher
Major(s) and minor(s)
BS in honors mathematics; BS in honors computer science
Minor in philosophy
What's your research story?
Research Interests: My dream is to unravel the mystery of the human mind and intelligence. I am interested in uncovering the patterns of neural activity that generate mental processes and behavior. I want to explore essential problems, including the theory of information processing performed by neural systems, the principles of intelligence, and computational models of perception, cognition, and consciousness. These rather ambitious goals guide my interest in science. In the end, I am curious about how the brain works, and I believe that computational modeling of brain function is an important strategy for reaching that goal. A better understanding of how the brain computes will also contribute to building better AI. We may not be there yet, but I dedicate myself to helping us get there.
Besides the principles of intelligence, which lie at the intersection of computer science and neuroscience, I am also interested in how we can practically build better systems in the real world, particularly in making better use of data, which encompasses databases and data management. We now live in a time of exponentially increasing data volumes, providing limitless opportunities for interdisciplinary collaboration, and I believe that building powerful data management tools and developing cutting-edge techniques for data analysis will help the world answer key scientific questions. That is what I am excited to do and where I want to put my passion.
Computational Neuroscience: Advances in AI research and computer science provide us with great opportunities to investigate how the brain works, either by simulating neural computation or by looking at the brain directly. To learn how cognition is implemented in the brain, an important strategy is to build computational models that can perform cognitive tasks, and to test such models with brain and behavioral experiments. With Prof. Duje Tadin, Prof. Ruyuan Zhang, and Prof. Oh-Sang Kwon, I have been employing spiking neural networks (SNNs) and deep neural networks (DNNs) to investigate the temporal evolution of perceptual choice. Perceptual decisions involve a process that evolves until it reaches a decision boundary, and it is important to understand how this process unfolds. Recent psychophysical data indicate that the visual system extracts motion-axis information faster than motion-direction information [1]. Based on a shared pattern found between trained RNNs and DNNs, my work shows that the spatiotemporal filtering for visual motion integration, center-surround antagonism, and stronger axis-wise inhibitory connections between the selective neural populations can explain how the visual system extracts motion-axis orientation before detecting motion direction. I presented this work as posters at VSS 2022 [9] and the Optica Fall Vision Meeting 2022 [6].
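The boundary-crossing process described above is commonly formalized as a drift-diffusion model. As a rough illustration only (a textbook sketch, not the SNN/DNN models used in this work), noisy evidence accumulates over time until it hits one of two decision boundaries:

```python
import numpy as np

def drift_diffusion_trial(drift, boundary, dt=0.001, noise=1.0,
                          rng=None, max_t=5.0):
    """Simulate one trial: evidence accumulates with the given drift plus
    Gaussian noise until it crosses +boundary (choice 1) or -boundary
    (choice 0). Returns (choice, decision_time)."""
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < boundary and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if x > 0 else 0), t

# With positive drift, most trials end at the upper boundary.
rng = np.random.default_rng(0)
trials = [drift_diffusion_trial(1.5, 1.0, rng=rng) for _ in range(200)]
accuracy = np.mean([choice for choice, _ in trials])
```

Stronger drift or lower boundaries trade accuracy against decision time, which is why the temporal unfolding of the process carries information beyond the final choice.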
Currently, I am working to make this story more complete. My efforts include MEG neuroimaging to uncover the corresponding brain dynamics of motion processing. I am also expanding the computational side of the project. Specifically, I am exploring why the brain and supervised deep neural networks trained on human behavior both form this shared pattern. My hypothesis is that it may relate to the visual learning experience of infants, so I am using unsupervised, biologically plausible deep neural networks such as PredNet [4] to test it. I am also continuing to investigate how to train spiking neural networks more efficiently; my most recent update has been submitted to COSYNE 2023 [5]. With Prof. Ruyuan Zhang, I also worked on visual abstract reasoning: we proposed the first semi-supervised learning framework for the Raven's Progressive Matrices task. This work is in preprint and will be submitted to IJCAI 2023 [12] once the submission site opens. My work in vision research has been recognized with the Walt and Bobbi Makous Prize.
In 2021, I joined the CogT Lab at Stanford. Working with Prof. Vankee Lin and Prof. Ehsan Adeli, I am investigating stress neurobiology in both human and animal brains with computational approaches. For humans, we collected concurrent fMRI and ECG data and measured subjects' perceived stress. I built a spatio-temporal graph convolutional network to find the temporal patterns of brain regions, functional connectivity, and brain-heart interactions related to stress. The key challenge is how to utilize the two modalities given the limited sample size. Our animal work is a data-driven analysis in mice, which I presented as a poster at SfN 2022 [10]. The Xerox Foundation awarded me the David T. Kearns Funding to continue my studies.
Throughout these experiences, I learned that there are countless approaches, at different scales and from different perspectives, to answering the most important scientific questions. As data grows in size, scope, and complexity, acquired from ever-larger portions of systems and spanning multiple levels of organization, I find that, alongside directly answering scientific questions, providing scientific data-analysis platforms and tools benefits the scientific community and the broader data-intensive industries. That is why I have put a lot of effort into database and data-management research, working with Prof. Fatemeh Nargesian. In recent years, there has been demand in the scientific community for network-dynamics analysis on large-scale streaming data. For example, in neuroscience, large efforts are underway to systematically map functional connectivity among all pairs of millimeter-scale brain regions based on large neuroimaging databases. To enable this, we solved the problem of efficiently computing and updating the correlation matrix for user-defined time windows on historical and real-time data. We presented TSUBASA, an algorithm for efficiently computing exact pairwise time-series correlation based on Pearson's correlation. By pre-computing simple, low-overhead sketches, TSUBASA can efficiently compute exact pairwise correlations on arbitrary time windows at query time; for real-time data, it updates the correlation matrix in a fast, incremental way. Our experiments show that TSUBASA is at least one order of magnitude faster than a baseline for both historical and real-time data. This work was published in SIGMOD 2022 [7], and I was awarded a SIGMOD Student Travel Grant to give a talk there.
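The sketch idea can be illustrated in miniature. The following is a simplified sketch of the general approach (not TSUBASA's actual algorithm): with prefix sums of x, y, x², y², and xy precomputed once, the exact Pearson correlation of any query window follows in O(1) per pair:

```python
import numpy as np

def build_sketch(x, y):
    """Prefix sums enabling O(1) exact Pearson correlation on any window."""
    pad = lambda a: np.concatenate([[0.0], np.cumsum(a)])
    return {"sx": pad(x), "sy": pad(y),
            "sxx": pad(x * x), "syy": pad(y * y), "sxy": pad(x * y)}

def window_corr(s, i, j):
    """Exact Pearson correlation of x[i:j] and y[i:j] from the sketch alone."""
    n = j - i
    sx = s["sx"][j] - s["sx"][i]
    sy = s["sy"][j] - s["sy"][i]
    sxx = s["sxx"][j] - s["sxx"][i]
    syy = s["syy"][j] - s["syy"][i]
    sxy = s["sxy"][j] - s["sxy"][i]
    num = n * sxy - sx * sy
    den = np.sqrt(n * sxx - sx * sx) * np.sqrt(n * syy - sy * sy)
    return num / den

rng = np.random.default_rng(0)
x, y = rng.normal(size=1000), rng.normal(size=1000)
s = build_sketch(x, y)
# Matches a direct computation on the same window:
assert np.isclose(window_corr(s, 100, 400),
                  np.corrcoef(x[100:400], y[100:400])[0, 1])
```

When a new data point streams in, only the last entry of each prefix array needs to be extended, which is the intuition behind incremental updates on real-time data.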
Following this, we presented TSUPY, a Python library that extends Jupyter Notebook as instrumentation for performing network construction and analysis at interactive speed. It was published in CIKM 2022 [2], where I presented the work for our group (thanks to the CIKM Travel Grant). The paper focuses on how TSUPY enables dynamic network analysis of climate data; we also show how it can be applied to neuroimaging to understand functional connectivity among brain regions. Recently, I designed TSUBASA++, which further improves TSUBASA's performance in the sliding-window scenario by pruning pairwise correlation computations based on temporal prediction. We designed the first benchmark for the correlation-network construction problem to evaluate the robustness and efficiency of TSUBASA++: it is an order of magnitude faster than TSUBASA and substantially more accurate than the recent ParCorr model [13]. I presented part of this work at MIT [8], and we plan to submit the full paper to VLDB 2023 [11]. These projects have been recognized, and I was awarded the Discover Grant as a Schwartz Fellow to continue the studies.
Another project I worked on in databases is sampling over unions of joins. Data scientists often draw on multiple relational data sources to collect training data, and a standard assumption in machine learning is that training data is an i.i.d. sample of the underlying distribution. Given a set of joins, we consider the problem of obtaining a random sample from the union of the joins without performing the full join and union. We present a general framework for random sampling over the set and disjoint union of chain, acyclic, and cyclic joins, with guarantees of sample uniformity and independence. We also study the novel problem of approximating the union size of joins, and propose both a direct method and an online-aggregation method for approximating the overlap size of joins. We evaluate our framework on workloads from the TPC-H benchmark and explore the trade-off between the accuracy of union approximation and sampling efficiency. This work has been submitted to SIGMOD 2023 [3]. My work in computer science has been recognized: I was nominated for the CRA Outstanding Undergraduate Researcher Award by the University of Rochester and received a nationwide Honorable Mention.
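For intuition about why sampling a union without materializing it is subtle, here is a minimal rejection-sampling sketch over plain in-memory tables (an illustrative toy, not the paper's framework for joins): rows appearing in several tables must not be over-represented, so each row is "owned" by the first table containing it and draws from other tables are rejected.

```python
import random

def sample_set_union(tables, k, seed=0):
    """Uniform sample (with replacement) from the set union of tables,
    without building the union. Pick a table proportional to its size,
    draw a row, and accept it only if it does not also appear in an
    earlier table, so every distinct row is accepted with probability
    1 / (total row count) per round."""
    rng = random.Random(seed)
    sets = [set(t) for t in tables]          # membership tests for rejection
    sizes = [len(t) for t in tables]
    out = []
    while len(out) < k:
        i = rng.choices(range(len(tables)), weights=sizes)[0]
        row = tables[i][rng.randrange(sizes[i])]
        if any(row in sets[j] for j in range(i)):
            continue                         # owned by an earlier table: reject
        out.append(row)
    return out

# Row (3,) appears in both tables but is sampled no more often than the rest.
t1, t2 = [(1,), (2,), (3,)], [(3,), (4,)]
sample = sample_set_union([t1, t2], 1000, seed=42)
```

The rejection step is exactly what becomes expensive when the "tables" are join results that are never materialized, which is where the framework's size-approximation methods come in.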
How did you initially secure your research position?
Prof. Fatemeh Nargesian emailed the department saying she was looking for a research assistant. I emailed her and went through an interview.
Departments/programs of research
Tadin Lab, Department of Brain and Cognitive Sciences, University of Rochester
CogT Lab, Research Assistant, Stanford University
CCNN Lab, Shanghai Jiao Tong University
Data Intelligence Lab, Department of Computer Science, University of Rochester
Department of Philosophy, University of Rochester
Department of Mathematics, University of Rochester
Any research presentations, awards, or publications?
Published:
1. Yunlong Xu, Jinshu Liu, and Fatemeh Nargesian. 2022. TSUBASA: Climate Network Construction on Historical and Real-Time Data. In Proceedings of the 2022 International Conference on Management of Data (SIGMOD '22).
2. Jinshu Liu, Yunlong Xu, Fatemeh Nargesian, and Gourab Ghoshal. 2022. TSUPY: Dynamic Climate Network Analysis Library. In Proceedings of the 31st ACM International Conference on Information & Knowledge Management (CIKM '22).
3. Yunlong Xu, Adam Turnbull, Ju Lu, Yi Zuo, Feng Vankee Lin. 2022. Data-driven analysis in mice reveals whole-brain dynamics along a known anterior-posterior axis that inform the neural basis of behavior, the 50th Society for Neuroscience’s Annual Meeting (SFN ’22).
4. Yunlong Xu, Duje Tadin, Oh-Sang Kwon, Ruyuan Zhang. 2022. Investigating temporal evolution of motion direction judgments within a biophysically realistic network, the 2022 Annual Meeting of the Vision Sciences Society (V-VSS ’22).
5. Yunlong Xu, Oh-Sang Kwon, Ruyuan Zhang, Duje Tadin. 2022. Investigating temporal evolutions of perceptual choice within biological and artificial neural networks, the 2022 Optica Fall Vision Meeting (OPTICA FVM ’22).
6. Yunlong Xu, Jinshu Liu, Peizhen Yang, Noah Viso, Fatemeh Nargesian. 2022. TSUBASA-PLUS: Correlation Matrix Computation on Sliding Windows, the 2022 IEEE MIT Undergraduate Research Technology Conference (MIT URTC '22).
7. Yurong Liu*, Yunlong Xu*, Fatemeh Nargesian. 2022. Join Size Estimation Over Union of Join Paths, the 2022 IEEE MIT Undergraduate Research Technology Conference (MIT URTC '22).
In Review:
8. Yurong Liu*, Yunlong Xu*, Fatemeh Nargesian. Sampling over Union of Joins, submitted to the 2023 ACM Special Interest Group on Management of Data Conference (SIGMOD '23).
9. Yunlong Xu. Training Reduced Recurrent Spiking Neural Network in Perceptual Choice, the 2023 Computational and Systems Neuroscience meeting (COSYNE Abstracts '23).
In Preprint:
10. Yunlong Xu, Linxiao Yang, Ruyuan Zhang. Semi-Supervised Visual Abstract Reasoning by Rule-Based Augmentation.