Sotabase

Career

· ML Foundation Engineer, NVIDIA (2024–)
· Teaching Assistant, Stanford University (2024)
· ML Foundation Intern, NVIDIA (2023)
· Master of Science (MS), Computer Science, Stanford University (2023–2024)
· Intern, Meta (2022)
· Deep Learning Intern, NVIDIA (2022)
· Participant, Stanford Machine Learning Group (2021–)
· Research Assistant, Stanford University (2021–2023)
· Schneider Fellow, U.S. Green Building Council (2021)
· Research Assistant, Stanford Doerr School of Sustainability (2020)
· Bachelor of Science (BS), Computer Science, Stanford University (2019–2024)

Publications (10)

Research on the Application of Deep Learning-based BERT Model with Additional Pretraining and Multitask Fine-Tuning (1 citation)
BERT: Battling Overfitting with Multitask Learning and Ensembling
Combining Contrastive Learning with Layer Utility Analysis and Experimental Multi-Task Finetuning to Improve mini-BERT Performance
Fine-tuning minBERT for multi-task prediction
How to make your BERT model an xBERT in multitask learning?
Improving minBERT on Downstream Tasks Through Combinatorial Extensions
MinBERT and Downstream Tasks
Multi-task BERT Fine-Tuning with Gradient Tricks
QAN-et al.: Exploring Extensions on QANet (2022)
Rhapsody on a Theme of Gradient Surgery: Variations to Improve minBERT for Multi-Task Learning
Timothy Dai | Researcher Profile | Sotabase