About

The Gradient is an organization with two missions: making it easier for anyone to learn about AI, and facilitating discussion within the AI community. We were founded in 2017 by a group of students and researchers at the Stanford AI Lab and are now run by a collection of engineers, researchers, and PhD students. Our current projects include The Gradient Magazine, The Gradient Podcast, The Update newsletter, and the Mastodon instance Sigmoid Social.

We are a non-profit, volunteer-run effort led by researchers in the AI community.


Interested in contributing?

Learn more about how you can write and get involved with The Gradient by filling out this short form.

Editorial Board

Hugh Zhang (he/him) is a graduate student at Harvard EconCS and a cofounder of The Gradient. His recent research interests include generative models, AI policy, game theory, and multi-agent reinforcement learning. In his spare time, he enjoys writing, playing Go, and eating burgers at In-N-Out. Follow him on Twitter.

Andrey Kurenkov (he/him) is a PhD student with the Stanford Vision and Learning Lab. His work primarily focuses on applying deep reinforcement learning for robotic manipulation, with several publications utilizing supervised learning and imitation learning as well. Besides being a cofounder of The Gradient, he also founded the publication Skynet Today, created the Last Week in AI newsletter, and is a co-host of the Let's Talk AI podcast.

Daniel Bashir is a machine learning compiler engineer. His research interests lie at the intersection of machine learning and information theory. In 2021 he wrote the book "Towards Machine Literacy" to give an accessible introduction to a range of issues in AI ethics and governance. In his spare time, he enjoys reading fiction, playing violin, writing, cooking, and exercising. Daniel hosts, records, and produces The Gradient Podcast, and runs The Gradient's Update newsletter. Follow him on Twitter.

Justin Landay (they/them) completed their undergraduate and master's degrees at George Washington University, publishing numerous papers on machine learning applications in nuclear physics. They are now a Senior Data Scientist at Etsy, focusing on using deep learning to identify and mitigate disruptive and fraudulent behavior.

Bradly Alicea holds a PhD from Michigan State University. His interests center on computational science, developmental biology, and cognitive systems, and he is currently Head Scientist and Founder of Orthogonal Research and a Senior Contributor at the OpenWorm Foundation. Bradly also manages open-source community activities at Rokwire and administers the Synthetic Daisies blog.

Ather Fawaz is a software engineer at noon.com working in the ad-tech space. He takes a keen interest in deep learning (particularly GANs) and quantum computing, and has covered new research in these areas at Neowin.net. He's also authored a beginner-friendly course on quantum computing at educative.io. In his spare time, you'll find him engrossed in the world of papercraft, books, Formula 1 racing, and football. You can follow him on Twitter.

Marco Cognetta is a PhD student at the Tokyo Institute of Technology and a PhD Student Researcher at Google Tokyo. He is interested in federated learning, interpretability, and high school level computer science education. You can find him on Twitter.

Sharut Gupta is a PhD student at MIT CSAIL, whose research interests broadly lie in self-supervised learning, robustness, and out-of-distribution generalization. Sharut has previously worked at Meta AI, Google Research, Microsoft Research, and Mila.

Jonathan Xue is a high school student in the Bay Area. He is interested in AI’s intersections with ethics and policy, and has written about global AI regulation along with fair use for language models. Currently, he is working on developing equitable optimization algorithms and finding transparent approaches for foundation model documentation. In his free time, Jonathan enjoys baking, trail running, and solving linguistic puzzles.


Alumni Editors

Eric Wang
Nancy Xu
Jordan Alexander
Philip Hwang
Simone Totaro
Yuge Shi
Stan Xie
Kai-Siang Ang
Mirantha Jayathilaka
Max Smith
Horace He
Steven Ban
Liam Li
Jessica Dai

Code of Conduct

Our code of conduct can be found here.