Hey there! This is the first edition of this newsletter, where I share interesting new things I find in technology and AI (paper summaries, open-source projects, and more) straight to your inbox!
My updates on Open source 🧑‍💻
My implementation of Conformer: Convolution-augmented Transformer for Speech Recognition, a Transformer variant.
Released v0.2.0, which supports wider dependency versions and adds unit tests.
My writings 📝
We present a new, challenging dataset, CPPE-5, with the goal of allowing the study of subordinate categorization of medical personal protective equipment, which is not possible with other popular datasets.
Great reads from the community 📖
With the right combination of methods, ConvNets can outperform Transformers for vision, achieving 87.8% top-1 accuracy on ImageNet-1k and beating Swin on COCO and ADE20K. This heats up the debate between ConvNets and Transformers.
Vectorization allows you to speed up processing of homogeneous data in Python. Learn what it means, when it applies, and how to do it.
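To make the idea concrete, here is a minimal sketch (assuming NumPy; the function names are my own, not from the article) contrasting an element-by-element Python loop with a vectorized equivalent that hands the whole array to compiled code:

```python
import numpy as np

# Pure-Python approach: the interpreter processes one element at a time.
def double_loop(values):
    return [v * 2 for v in values]

# Vectorized approach: NumPy applies the operation to the entire
# homogeneous array at once in compiled C code.
def double_vectorized(values):
    return np.asarray(values) * 2

data = list(range(1_000_000))
assert double_loop(data) == double_vectorized(data).tolist()
```

On large homogeneous arrays the vectorized version is typically much faster, since the per-element interpreter overhead disappears; the article covers when this applies and when it does not.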
At PyTorch's five-year mark, this thread covers some of the interesting and hard decisions and pivots the team has had to make.
NNs can learn through a process of “grokking” a pattern in the data: even after completely overfitting the training set, generalization performance can improve rapidly from random-chance level to perfect generalization.
data2vec: the first general high-performance self-supervised algorithm for speech, vision, and text. Brings back DeepMind Perceiver memories.
A really interesting new approach for growing neural networks that maximizes the gradient norm when adding new neurons, so neurons can be added during training without disrupting what has already been learned.
Open-source from the community 👏
An open-source framework for generating chip floor plans with distributed deep reinforcement learning, built on top of TF-Agents and TensorFlow 2.x.
An interesting tech community initiative for the North East region of India, aimed at upcoming and existing community leaders.
That’s all for this issue; I hope you liked it. Stay tuned for more updates, and feel free to submit links for the next issue.
Regards,
Rishit Dagli