
Latest Posts

  • Suno AI V3 Alpha: Music Generation

    In the ever-evolving landscape of artificial intelligence and music, Suno AI has emerged as a beacon of innovation. The introduction of Suno AI V3 Alpha marks a significant milestone in text-to-song technology, offering an unparalleled experience in music creation. This latest version not only expands on the capabilities of its predecessors but also introduces a…

  • The Evolution of Model Compression in the LLM Era

    The advent of transformers in 2017 set off an avalanche of AI milestones, starting with the spectacular achievements of large language models (LLMs) in natural language processing (NLP) and quickly catalyzing advances in other domains such as computer vision and robotics. The unification of NLP and computer vision problems…

  • Visualisation 101: Choosing the Best Visualisation Type

    A comprehensive guide to different visualisation use cases. I believe that the primary goal of analysts is to help their product teams make the right decisions based on data. This means that the main result of analysts’ work is not just producing numbers or dashboards but influencing sound, data-driven decisions. So, presenting the results of…

  • The full training run of GPT-5 has gone live

    We can expect it to be released in November, perhaps on the second anniversary of the legendary ChatGPT launch. In a similar timeframe, we will also be getting Gemini 2 Ultra, LLaMA-3, Claude-3, Mistral-2, and many other groundbreaking models (Google’s Gemini already seems to be giving GPT-4 Turbo tough competition). It is almost certain that…

  • 7 Books to Read on Artificial Intelligence

    2024 is expected to be the “year of AI”, in which we will see even more breakthroughs than in 2023. In this post I will share some of the most interesting books about Artificial Intelligence I have been reading lately, together with my own thoughts. Life 3.0: in this book, Tegmark talks about…

  • Understanding Parameter-Efficient Finetuning of Large Language Models: From Prefix Tuning to LLaMA-Adapters

    In the rapidly evolving field of artificial intelligence, using large language models efficiently and effectively has become increasingly important. Parameter-efficient finetuning stands at the forefront of this pursuit, allowing researchers and practitioners to reuse pretrained models while minimizing their computational and resource footprints. It also allows us to train AI models on…
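
    As a taste of what “parameter-efficient” means in practice, here is a minimal, illustrative sketch of prefix tuning in plain PyTorch; the class name, sizes, and initialization are our own assumptions, not code from the article:

    ```python
    import torch
    import torch.nn as nn

    class PrefixTuningEmbedding(nn.Module):
        """Prepend trainable "soft prompt" vectors to frozen token embeddings.

        Only `self.prefix` receives gradients, so the trainable parameter
        count is num_prefix_tokens * embedding_dim -- a tiny fraction of
        the full model.
        """
        def __init__(self, base_embedding: nn.Embedding, num_prefix_tokens: int = 20):
            super().__init__()
            self.base_embedding = base_embedding
            self.base_embedding.weight.requires_grad = False  # freeze pretrained weights
            self.prefix = nn.Parameter(
                torch.randn(num_prefix_tokens, base_embedding.embedding_dim) * 0.02
            )

        def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
            token_embeds = self.base_embedding(input_ids)        # (batch, seq, dim)
            prefix = self.prefix.unsqueeze(0).expand(input_ids.shape[0], -1, -1)
            return torch.cat([prefix, token_embeds], dim=1)      # (batch, prefix+seq, dim)

    # Toy usage: a vocabulary of 1000 tokens with 64-dimensional embeddings.
    embed = PrefixTuningEmbedding(nn.Embedding(1000, 64), num_prefix_tokens=10)
    print(embed(torch.randint(0, 1000, (2, 8))).shape)  # torch.Size([2, 18, 64])
    ```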

  • Some Techniques To Make Your PyTorch Models Train (Much) Faster

    This blog post outlines techniques for improving the training performance of your PyTorch model without compromising its accuracy. To do so, we will wrap a PyTorch model in a LightningModule and use the Trainer class to enable various training optimizations. By changing only a few lines of code, we can reduce the training time on…
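
    The LightningModule/Trainer pattern the excerpt mentions looks roughly like the sketch below; the model, hyperparameters, and Trainer flags are illustrative assumptions, not the article’s exact code:

    ```python
    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    import pytorch_lightning as pl

    class LitClassifier(pl.LightningModule):
        """A plain PyTorch model wrapped so the Trainer can drive its training."""
        def __init__(self):
            super().__init__()
            self.model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

        def training_step(self, batch, batch_idx):
            x, y = batch
            loss = F.cross_entropy(self.model(x), y)
            self.log("train_loss", loss)
            return loss

        def configure_optimizers(self):
            return torch.optim.AdamW(self.parameters(), lr=1e-3)

    # The Trainer flags, not the model code, switch on the optimizations:
    trainer = pl.Trainer(
        max_epochs=3,
        accelerator="auto",    # pick a GPU automatically when one is available
        precision="16-mixed",  # mixed-precision training (assumes a CUDA GPU; drop on CPU)
    )
    # trainer.fit(LitClassifier(), train_dataloaders=train_loader)  # train_loader: your DataLoader
    ```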

  • Optimizing Memory Usage for Training LLMs and Vision Transformers in PyTorch

    Peak memory consumption is a common bottleneck when training deep learning models such as vision transformers and LLMs. This article provides a series of techniques that can lower memory consumption by approximately 20x without sacrificing modeling performance and prediction accuracy. Introduction In this article, we will be exploring 9 easily-accessible techniques to reduce memory usage…
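
    Two of the easily accessible levers in this category, sketched in plain PyTorch under the assumption of a CUDA GPU (the model, sizes, and step count are illustrative stand-ins, not the article’s code): mixed-precision autocasting and gradient accumulation.

    ```python
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10)).cuda()
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    scaler = torch.cuda.amp.GradScaler()   # rescales fp16 gradients to avoid underflow
    accumulation_steps = 4                 # 4 micro-batches emulate one larger batch

    for step in range(8):  # stand-in for iterating over a DataLoader
        x = torch.randn(16, 1024, device="cuda")
        y = torch.randint(0, 10, (16,), device="cuda")

        # Autocast runs most ops in float16, roughly halving activation memory.
        with torch.autocast(device_type="cuda", dtype=torch.float16):
            loss = nn.functional.cross_entropy(model(x), y)

        # Accumulate gradients over several small batches instead of one big one.
        scaler.scale(loss / accumulation_steps).backward()
        if (step + 1) % accumulation_steps == 0:
            scaler.step(optimizer)
            scaler.update()
            optimizer.zero_grad(set_to_none=True)  # release gradient memory
    ```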

  • Improving LoRA: Implementing DoRA from Scratch

    Low-rank adaptation (LoRA) is a machine learning technique that modifies a pretrained model (for example, an LLM or vision transformer) to better suit a specific, often smaller, dataset by adjusting only a small, low-rank subset of the model’s parameters. This approach is important because it allows for efficient finetuning of large models on task-specific data,…
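
    To make the low-rank idea concrete, here is a minimal LoRA layer sketch in PyTorch; this is our own illustrative code, not the article’s DoRA implementation, which additionally decomposes the weight into magnitude and direction components:

    ```python
    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        """A frozen pretrained linear layer plus a trainable low-rank update.

        forward(x) = linear(x) + (alpha / rank) * x @ A^T @ B^T,
        so only the small matrices A and B are trained.
        """
        def __init__(self, linear: nn.Linear, rank: int = 8, alpha: float = 16.0):
            super().__init__()
            self.linear = linear
            self.linear.weight.requires_grad = False   # freeze pretrained weights
            if self.linear.bias is not None:
                self.linear.bias.requires_grad = False
            self.lora_A = nn.Parameter(torch.randn(rank, linear.in_features) * 0.01)
            self.lora_B = nn.Parameter(torch.zeros(linear.out_features, rank))  # zero init: update starts as a no-op
            self.scaling = alpha / rank

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.linear(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)

    # Toy usage: wrap a pretrained 512 -> 512 projection with a rank-8 update.
    layer = LoRALinear(nn.Linear(512, 512), rank=8)
    print(layer(torch.randn(2, 512)).shape)  # torch.Size([2, 512])
    ```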