About

I will join Facebook AI Research (FAIR) this summer as part of the AI Residency program.

The nickname Taineleau is a substring of Fontainebleau: my given name (Danlu, 丹露) derives from the Chinese translation of Fontainebleau :).
I started programming when I was 10 and was admitted to Fudan University through programming contests (the Olympiad in Informatics). After a year and a half's wandering in the Dept. of Philosophy, I eventually shifted back to Computer Science.

Education
  • 2015/01 - NOW | Computer Science, Fudan University, GPA: 3.65, Major: 3.86, Rank: 10/119
  • 2017/01 - 2017/08 | Computer Science, Cornell University
  • 2016/08 - 2016/12 | Computing, National University of Singapore
  • 2013/09 - 2015/01 | Philosophy, Fudan University
Experience
  • 2017/09 - 2017/12 | Research Intern @ Stanford University, Ermon Group
  • 2017/01 - 2017/08 | Research Intern @ Cornell University, Kilian's Group
  • 2016/08 - 2016/12 | Research Assistant @ NUS, LMS
  • 2016/02 - 2016/07 | Intern @ NVIDIA, Compute Arch, Deep Learning
  • 2015/09 - 2016/07 | Research Assistant @ Fudan NLP group

Works

Multi-Scale Dense Convolutional Networks for Efficient Prediction (ICLR'18 Oral)

Gao Huang, Danlu Chen, Tianhong Li, Felix Wu, Laurens van der Maaten, Kilian Q. Weinberger

We introduce a novel convolutional neural network architecture with the ability to adapt dynamically to computational resource limits at test time.
[Paper, GitHub]
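The anytime-prediction idea can be sketched in a few lines: classifiers are attached at increasing depths, and inference exits as soon as one is confident enough. This is a toy illustration, not the MSDNet code; `anytime_predict`, the classifier interface, and the confidence threshold are all hypothetical.

```python
# Toy sketch of anytime (early-exit) prediction. Hypothetical setup:
# each classifier maps an input to a list of class probabilities, and
# classifiers are ordered from cheapest/shallowest to most expensive.

def anytime_predict(x, classifiers, threshold=0.9):
    """Return (probs, exit_index): stop at the first confident exit."""
    probs = None
    for i, clf in enumerate(classifiers):
        probs = clf(x)
        if max(probs) >= threshold:  # confident enough: exit early
            return probs, i
    return probs, len(classifiers) - 1  # budget exhausted: use last exit
```

Easy inputs thus pay only for the shallow classifiers, while hard inputs fall through to the deeper, more expensive ones.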


LSH Softmax: Sub-Linear Learning and Inference of the Softmax Layer in Deep Architectures

Daniel Levy, Danlu Chen, Stefano Ermon

A softmax approximation layer for sub-linear learning and inference with strong theoretical guarantees.
Bridge Practice and Theory Workshop @ NIPS'18.

Memory-Efficient DenseNets

Geoff Pleiss*, Danlu Chen*, Gao Huang, Tongcheng Li, Laurens van der Maaten, Kilian Q. Weinberger

Most implementations of DenseNets create many intermediate outputs. Without proper management, the memory needed to store these outputs grows quadratically with network depth. This implementation uses shared-memory allocations to store intermediate outputs.
[Technical Report, GitHub]
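As a back-of-envelope illustration (not the released implementation): if every layer keeps its own copy of the concatenated features it consumes, storage grows quadratically with depth, whereas writing all concatenation outputs into one shared allocation (and recomputing them in the backward pass when needed) keeps the cost linear. The function names and the "feature maps" unit here are illustrative.

```python
def naive_storage(depth, growth=1):
    """Layer k keeps its own copy of the k earlier feature maps it
    concatenates, so total stored maps are 1 + 2 + ... + depth: O(depth^2)."""
    return sum(k * growth for k in range(1, depth + 1))

def shared_storage(depth, growth=1):
    """With one shared concatenation buffer, overwritten by each layer and
    recomputed during the backward pass, only the depth feature maps
    themselves persist: O(depth)."""
    return depth * growth
```

For a 100-layer block this is 5050 versus 100 stored maps, at the price of some recomputation during backpropagation.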

Cached Long Short-Term Memory Neural Networks for Document-Level Sentiment Classification (EMNLP 2016)

Jiacheng Xu, Danlu Chen, Xipeng Qiu and Xuanjing Huang

We present Cached LSTM (CLSTM) to capture the overall semantic information in long texts. CLSTM introduces a cache mechanism that divides memory into several groups with different forgetting rates, enabling the network to better retain sentiment information within a recurrent unit.
[Paper, GitHub]
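The cache mechanism can be caricatured in one line: the memory cells are split into groups, each updated with its own forgetting rate, so slow-changing groups can carry sentiment signals across a long document. This is a deliberately simplified sketch, not the model's actual update; in the paper the rates are learned within disjoint intervals per group and the update involves full LSTM gating.

```python
def cache_update(cells, candidates, rates):
    """One simplified update step: each memory group interpolates between
    its old state and a new candidate value with its own forgetting rate r.
    Small r -> slow-changing (long-term) group; large r -> fast-changing."""
    return [(1 - r) * c + r * x for c, x, r in zip(cells, candidates, rates)]
```

With rates (0.25, 0.75), for example, the first group retains 75% of its old state at each step while the second mostly tracks the new input.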

Projects

  • Research on the large-scale softmax layer. During my internship at NVIDIA, I dove into the softmax layer from a high-performance-computing perspective and, with the help of my mentors, developed a multi-GPU solution that improves the computational efficiency of the softmax layer in neural networks.
  • μνήμη (mneme, memory) is an app for beginners to memorize the basics of classical languages (such as Ancient Greek and Latin), built with the Django web framework. With morphology rules written in Python, the app can auto-generate tests for beginners in ancient languages. (It will be publicly available later :D)
  • Pouring Ball won 1st place at HackShanghai 2015. It is a multi-player interactive physics game developed in 24 hours with three teammates. You can “pour” a ball from one mobile device to another, mirroring the real-world gesture in the virtual game. It is built on the accelerometer & gyroscope APIs of portable devices. A demo video.
  • Web Development & Design. I maintain several websites, including my own homepage and the Fudan NLP Group's.

Honors

Programming Contest
  • Fudan ACM/ICPC 2015
    Best Female Participant & Second Prize (top 15%)
  • Guangdong (key) Olympiad in Informatics (GDKOI)
    First Prize (top 10%)
  • National Olympiad in Informatics (NOIp)
    First Prize
Hackathons
  • HackShanghai 2015
    Champion (Rank 1 out of 250 participants from 8 countries)
  • Google Female Student Hackathon
    Second Prize
Scholarships
  • Xiangqian Scholarship (20,000 CNY), 2017
    Shanghai Tangjunyuan Education Foundation
  • Women TechMaker Scholarship (former Google Anita Borg Memorial Scholarship), 2017
    Google Inc.
  • ITCSC-INC Winter School Scholarship (full reimbursement), 2017
    Chinese University of Hong Kong (CUHK)
  • Scholarship for Oversea Visiting Student, 2016
    Fudan University
  • Scholarship for Exchange Undergrad (7000 SGD), 2016
    China Scholarship Council (CSC), declined
  • China National Scholarship (top 2%), twice: 2016, 2015
    Ministry of Education of P.R.China
  • CCF 100 Elite Collegiate Award (top 0.1%), 2016
    China Computer Federation (CCF)
  • Excellent Student (top 1%), 2016
    Fudan University
  • Outstanding Undergraduate Scholarship, 2016
    EMC
  • Fudan Scholarship of Liberal Arts (top 30%), 2014
    Fudan University

"Best of all possible worlds."

I enjoy the general issues of epistemology; Plato and Leibniz are probably my favorite philosophers.

I love traveling around the world. The background photo was taken by my mother in Melbourne, Australia.

I love classical music, especially Beethoven's works. Recently, I have become fond of ballet music.

I speak Cantonese (native), Mandarin Chinese (native), and English, and I can read elementary Ancient Greek and Japanese.

I learned Chinese painting and ballet for several years as a little girl. I also play the piano, but only at a beginner's level (I took weekly piano classes for a year).