Remote SSH Access Without Static IP or Domain Name Using Cloudflare Zero Trust: A Step-by-Step Guide

During my undergraduate studies, I relied on a gaming laptop for most of my university projects because I needed a powerful CPU and GPU. Carrying it around in my backpack was cumbersome, but at the time, I didn’t see a better option. I couldn’t understand why anyone would instead choose an expensive lightweight laptop for its battery life or screen quality. Over time, my perspective has changed. If you can afford it, the ideal setup is a powerful desktop (or homelab) combined with a lightweight laptop....

Posted: December 12, 2024 · 7 min · Morteza Mirzaei

Scaling Deep Learning with Distributed Training: Data Parallelism to Ring AllReduce

In this post, we’ll explore the crucial role of distributed training in scaling Deep Neural Networks (DNNs) to handle large datasets and complex models. We’ll take an in-depth look at data parallel training, the most widely used technique in this domain, and dive into its implementation to provide an intuitive understanding of how it improves efficiency in deep learning. Why do we need distributed training? There are several advantages, but in my opinion, these are the two most important, and the ones we will focus on:...

Posted: August 11, 2024 · Updated: December 12, 2024 · 9 min · Morteza Mirzaei