Latest videos

Generative AI
2,607,086 Views · 3 years ago

The Memo: https://lifearchitect.ai/memo/

====
AI Report Card: https://lifearchitect.ai/report-card/
GPT-3 paper: https://arxiv.org/abs/2005.14165
Other papers: https://lifearchitect.ai/papers
Mid-2022 report + video: https://lifearchitect.ai/the-sky-is-bigger/
What's in my AI: https://lifearchitect.ai/whats-in-my-ai/
====

Read more: https://lifearchitect.ai/
https://lifearchitect.ai/models/

Dr Alan D. Thompson is a world expert in artificial intelligence (AI), specialising in the augmentation of human intelligence, and advancing the evolution of ‘integrated AI’. Alan’s applied AI research and visualisations are featured across major international media, including citations in the University of Oxford’s debate on AI Ethics in December 2021.
https://lifearchitect.ai/

Music:
Under licence.

Liborio Conti - Looking Forward (The Memo outro)
https://no-copyright-music.com/

Generative AI
2,383,215 Views · 3 years ago

ChatGPT is a variant of the GPT (Generative Pre-trained Transformer) language model specifically designed for chatbot applications. It was developed by OpenAI and is trained to generate human-like text responses given a prompt or context. It can be used to build chatbots that carry on conversations with users in a natural and engaging way.

This video demonstrates a few things that can be done with ChatGPT, in Tamil. In particular, coding tasks: fixing bugs, explaining code, generating methods, etc.

ChatGPT link - https://chat.openai.com/chat

Github CodeLink - https://github.com/LogicFirstTamil

---------------------------------------- courses and playlists --------------------------------------------------
HTML and CSS: https://www.youtube.com/playli....st?list=PLYM2_EX_xVv
SQL: https://www.youtube.com/playli....st?list=PLYM2_EX_xVv
DS and ALGO in C/CPP: https://www.youtube.com/watch?v=QQBVNwA9rDI&list=PLYM2_EX_xVvVMXkQt4qqosJTplBq5v5oX
DS and ALGO in Java: https://www.youtube.com/watch?v=t2U989oaI1Q&list=PLYM2_EX_xVvX7_AmNY-Deacp3rT3MIXnE
Python Full Course with game: https://www.youtube.com/watch?v=BiDOehqG68g
Java Playlist: https://www.youtube.com/playli....st?list=PLYM2_EX_xVv
Java one video: https://www.youtube.com/watch?v=qOWPCPCDRZs&t=2751s
C Interview program playlist: https://www.youtube.com/playli....st?list=PLYM2_EX_xVv
C programming in one video: https://www.youtube.com/watch?v=JAy56OH58Y4&t=17269s
C programming playlist: https://www.youtube.com/watch?v=xKGg8UfR0mM&list=PLYM2_EX_xVvU7N5Lcp1SjKnGN5e6riUoj
C++ Playlist link: https://www.youtube.com/playli....st?list=PLYM2_EX_xVv

English channel link: https://www.youtube.com/channe....l/UCFhHB5_2UkzB4V5iL

Generative AI
2,095,613 Views · 3 years ago

github: https://github.com/krishnaik06..../Huggingfacetransfor
In this tutorial, we will show you how to fine-tune a pretrained model from the Transformers library. In TensorFlow, models can be directly trained using Keras and the fit method. In PyTorch, there is no generic training loop so the 🤗 Transformers library provides an API with the class Trainer to let you fine-tune or train a model from scratch easily.
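The "generic training loop" that Trainer abstracts away can be sketched in plain PyTorch. This is an illustrative sketch only: the tiny linear model and random data below are stand-ins for the pretrained Transformers model and real dataset used in the tutorial.

```python
# Sketch of the generic loop that the 🤗 Trainer class wraps for you.
# The linear model and random tensors are illustrative stand-ins, not
# part of the tutorial's actual code.
import torch
from torch import nn

model = nn.Linear(4, 2)                         # stand-in for a pretrained model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

X = torch.randn(32, 4)                          # toy features
y = torch.randint(0, 2, (32,))                  # toy labels

for epoch in range(3):                          # the loop Trainer hides from you
    optimizer.zero_grad()                       # reset accumulated gradients
    loss = loss_fn(model(X), y)                 # forward pass + loss
    loss.backward()                             # backward pass
    optimizer.step()                            # parameter update
```

In Keras, `model.fit(X, y)` performs this same loop internally, which is why TensorFlow users don't need Trainer.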
---------------------------------------------------------------------------------------------------------------------------------------------------------------
⭐ Kite is a free AI-powered coding assistant that will help you code faster and smarter. The Kite plugin integrates with all the top editors and IDEs to give you smart completions and documentation while you’re typing. I've been using Kite for a few months and I love it! https://www.kite.com/get-kite/?utm_medium=referral&utm_source=youtube&utm_campaign=krishnaik&utm_content=description-only
Subscribe to my vlogging channel
https://www.youtube.com/channe....l/UCjWY5hREA6FFYrthD
If you'd like to support the channel, please donate through the GPay UPI ID below:
Gpay: krishnaik06@okicici
Telegram link: https://t.me/joinchat/N77M7xRvYUd403DgfE4TWw

Please join my channel as a member to get additional benefits such as Data Science materials, members-only live streams, and more
https://www.youtube.com/channe....l/UCNU_lfiiWBdtULKOw

Connect with me here:
Twitter: https://twitter.com/Krishnaik06
Facebook: https://www.facebook.com/krishnaik06
instagram: https://www.instagram.com/krishnaik06

Generative AI
2,023,851 Views · 3 years ago

GPT-3 is a powerful NLP deep learning model. To understand GPT-3 and its later versions, we should first understand its fundamentals. This video covers the background before the Transformer, a summary of the Transformer, and the fundamentals of GPT by looking at GPT-1.

Generative AI
3,000,888 Views · 3 years ago

This short tutorial covers the basics of the Transformer, a neural network architecture designed for handling sequential data in machine learning.

Timestamps:
0:00 - Intro
1:18 - Motivation for developing the Transformer
2:44 - Input embeddings (start of encoder walk-through)
3:29 - Attention
6:29 - Multi-head attention
7:55 - Positional encodings
9:59 - Add & norm, feedforward, & stacking encoder layers
11:14 - Masked multi-head attention (start of decoder walk-through)
12:35 - Cross-attention
13:38 - Decoder output & prediction probabilities
14:46 - Complexity analysis
16:00 - Transformers as graph neural networks
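The attention mechanism walked through above boils down to a few lines. A minimal NumPy sketch of single-head scaled dot-product attention (toy shapes, made up for illustration):

```python
# Scaled dot-product attention from "Attention Is All You Need":
#   Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
import numpy as np

def attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the keys
    return weights @ V, weights                       # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))                       # 3 query positions, d_k = 8
K = rng.standard_normal((5, 8))                       # 5 key/value positions
V = rng.standard_normal((5, 8))
out, w = attention(Q, K, V)                           # out: one vector per query
```

Multi-head attention simply runs several of these in parallel with different learned projections of Q, K, and V, then concatenates the results.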

Original Transformers paper:
Attention is All You Need - https://arxiv.org/abs/1706.03762

Other papers mentioned:
(GPT-3) Language Models are Few-Shot Learners - https://arxiv.org/abs/2005.14165
(DALL-E) Zero-Shot Text-to-Image Generation - https://arxiv.org/abs/2102.12092
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding - https://arxiv.org/abs/1810.04805
Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity - https://arxiv.org/abs/2101.03961
Finetuning Pretrained Transformers into RNNs - https://arxiv.org/abs/2103.13076
Efficient Transformers: A Survey - https://arxiv.org/abs/2009.06732
Attention is Not All You Need: Pure Attention Loses Rank Doubly Exponentially with Depth - https://arxiv.org/abs/2103.03404
Do Transformer Modifications Transfer Across Implementations and Applications? - https://arxiv.org/abs/2102.11972
Gradient Flow in Recurrent Nets: the Difficulty of Learning Long-Term Dependencies - https://ml.jku.at/publications/older/ch7.pdf
Transformers are Graph Neural Networks (blog post) - https://thegradient.pub/transf....ormers-are-graph-neu

Video style inspired by 3Blue1Brown

Music: Trinkets by Vincent Rubinetti

Links:
YouTube: https://www.youtube.com/ariseffai
Twitter: https://twitter.com/ari_seff
Homepage: https://www.ariseff.com

If you'd like to help support the channel (completely optional), you can donate a cup of coffee via the following:
Venmo: https://venmo.com/ariseff
PayPal: https://www.paypal.me/ariseff

Generative AI
2,655,257 Views · 3 years ago

14 cool applications just built on top of OpenAI's GPT-3 (Generative Pre-trained Transformer) API (currently in private beta).


► Remember to Like, Comment, and Subscribe!

Connect with me:
Twitter - http://www.twitter.com/bakztfuture
Instagram - http://www.instagram.com/bakztfuture
Github - http://www.github.com/bakztfuture

Feel free to send me an email or just say hello:
bakztfuture@gmail.com

Generative AI
2,749,809 Views · 3 years ago

AI Papers Reading and Coding - Transformer: Attention Is All You Need

Today we will read and implement the paper Transformer: Attention Is All You Need. This video explains the next part: the Decoder. A famous application of the Transformer Decoder is the generative model GPT; you can generate an image of an avocado from a sentence. :D

This is the backbone architecture for today's Deep Learning models in natural language processing.

Paper code: https://github.com/bangoc123/transformer

This is part of the Papers-Videos-Code video series: https://protonx.ai/papers-videos-code/

Generative AI
2,033,150 Views · 3 years ago

Want to get your hands on GPT-3 but can't be bothered waiting for access?

Need to kick off some AI-powered text gen ASAP?

Want to write a love song using AI?

I got you!

In this video, you'll learn how to leverage GPT-Neo, a GPT-3 architecture clone with 2.7 BILLION parameters, to generate text and code. You'll learn how to get set up and leverage the model for a whole range of use cases in just 4 lines of code!

In this video you’ll learn how to:
1. Install GPT-Neo, a 2.7B-parameter language model
2. Generate Python code using GPT-Neo
3. Generate text using GPT-Neo and Hugging Face Transformers
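For reference, the "4 lines" boil down to a Hugging Face pipeline call along these lines (a sketch, not the video's exact code; the model name matches the Hugging Face hub entry, and the first call downloads roughly 10 GB of weights):

```python
def generate(prompt, max_length=50):
    # Lazy import so this sketch only needs transformers (and the large
    # GPT-Neo 2.7B checkpoint) when it is actually run.
    from transformers import pipeline
    generator = pipeline("text-generation", model="EleutherAI/gpt-neo-2.7B")
    return generator(prompt, max_length=max_length, do_sample=True)[0]["generated_text"]

# Example (not run here): generate("def fibonacci(n):")
```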

GET THE CODE FROM THE VIDEO: https://github.com/nicknochnack/GPTNeo

Code from the previous tutorial:
https://github.com/nicknochnack/MediaPipeHandPose

Chapters:
0:00 - Start
1:15 - How it Works
2:45 - Install PyTorch and Transformers
5:08 - Setup GPT Neo Pipeline
8:08 - Generate Text from a Prompt
14:46 - Export Text to File
17:48 - Wrap Up

Oh, and don't forget to connect with me!
LinkedIn: https://bit.ly/324Epgo
Facebook: https://bit.ly/3mB1sZD
GitHub: https://bit.ly/3mDJllD
Patreon: https://bit.ly/2OCn3UW
Join the Discussion on Discord: https://bit.ly/3dQiZsV

Happy coding!
Nick

P.s. Let me know how you go and drop a comment if you need a hand!

Generative AI
2,194,541 Views · 3 years ago

The Simple Transformers library is based on the Transformers library by Hugging Face. Simple Transformers lets you quickly train and evaluate Transformer models: only 3 lines of code are needed to initialize a model, train it, and evaluate it.
github: https://github.com/krishnaik06/Trnasformer-Bert
simple transformer: https://simpletransformers.ai/docs/qa-specifics/
simple transformer github: https://github.com/ThilinaRaja....pakse/simpletransfor
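Those "3 lines" look roughly like this (a hedged sketch using the classification API; the model choice and dataframes are placeholders, and the QA model linked above follows the same init/train/eval pattern):

```python
def train_and_eval(train_df, eval_df):
    # Lazy import so this sketch only requires simpletransformers when run.
    from simpletransformers.classification import ClassificationModel
    model = ClassificationModel("bert", "bert-base-uncased", use_cuda=False)  # 1. initialize
    model.train_model(train_df)                                               # 2. train
    return model.eval_model(eval_df)                                          # 3. evaluate
```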
⭐ Kite is a free AI-powered coding assistant that will help you code faster and smarter. The Kite plugin integrates with all the top editors and IDEs to give you smart completions and documentation while you’re typing. I've been using Kite for a few months and I love it! https://www.kite.com/get-kite/?utm_medium=referral&utm_source=youtube&utm_campaign=krishnaik&utm_content=description-only
Subscribe to my vlogging channel
https://www.youtube.com/channe....l/UCjWY5hREA6FFYrthD
If you'd like to support the channel, please donate through the GPay UPI ID below:
Gpay: krishnaik06@okicici
Telegram link: https://t.me/joinchat/N77M7xRvYUd403DgfE4TWw

Please join my channel as a member to get additional benefits such as Data Science materials, members-only live streams, and more
https://www.youtube.com/channe....l/UCNU_lfiiWBdtULKOw

Connect with me here:
Twitter: https://twitter.com/Krishnaik06
Facebook: https://www.facebook.com/krishnaik06
instagram: https://www.instagram.com/krishnaik06

Generative AI
2,970,876 Views · 3 years ago

AI/ML has been witnessing a rapid acceleration in model improvement in the last few years. The majority of the state-of-the-art models in the field are based on the Transformer architecture. Examples include models like BERT (which, when applied to Google Search, resulted in what Google calls "one of the biggest leaps forward in the history of Search") and OpenAI's GPT-2 and GPT-3 (which are able to generate coherent text and essays).

This video by the author of the popular "Illustrated Transformer" guide will introduce the Transformer architecture and its various applications. This is a visual presentation accessible to people with various levels of ML experience.


Intro (0:00)
The Architecture of the Transformer (4:18)
Model Training (7:11)
Transformer LM Component 1: FFNN (10:01)
Transformer LM Component 2: Self-Attention (12:27)
Tokenization: Words to Token Ids (14:59)
Embedding: Breathe meaning into tokens (19:42)
Projecting the Output: Turning Computation into Language (24:11)
Final Note: Visualizing Probabilities (25:51)
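The tokenization and output-projection steps in the outline can be illustrated with a toy vocabulary (everything here is made up for illustration; real models use subword tokenizers like BPE and a learned projection matrix):

```python
import numpy as np

# Tokenization: words to token ids (real models use subword tokenizers).
vocab = {"the": 0, "cat": 1, "sat": 2, "<unk>": 3}
ids = [vocab.get(w, vocab["<unk>"]) for w in "the cat sat".split()]

# Projecting the output: a final linear layer maps the model's last hidden
# state to one logit per vocabulary entry; softmax turns logits into
# probabilities over the next token.
rng = np.random.default_rng(0)
hidden = rng.standard_normal(8)                 # last hidden state (d_model = 8)
W_out = rng.standard_normal((8, len(vocab)))    # projection to vocab size
logits = hidden @ W_out
probs = np.exp(logits - logits.max())
probs /= probs.sum()                            # softmax: probabilities sum to 1
next_id = int(probs.argmax())                   # most probable next token id
```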

The Illustrated Transformer:
https://jalammar.github.io/ill....ustrated-transformer

Simple transformer language model notebook:
https://github.com/jalammar/ja....lammar.github.io/blo

Philosophers On GPT-3 (updated with replies by GPT-3):
https://dailynous.com/2020/07/....30/philosophers-gpt-
-----


Twitter: https://twitter.com/JayAlammar
Blog: https://jalammar.github.io/
Mailing List: http://eepurl.com/gl0BHL


More videos by Jay:
Jay's Visual Intro to AI
https://www.youtube.com/watch?v=mSTCzNgDJy4


How GPT-3 Works - Easily Explained with Animations
https://www.youtube.com/watch?v=MQnJZuBGmSQ

Generative AI
2,213,143 Views · 3 years ago

Why didn't OpenAI release their "Unicorn" GPT-2 large transformer? Rob Miles suggests why it might not just be a PR stunt.

Unicorn AI: https://youtu.be/89A4jGvaaKk
Unicorn AI (More Examples): https://youtu.be/p-6F4rhRYLQ
Generative Adversarial Networks (GANs): https://www.youtube.com/watch?v=5oXyibEgJr0

More from Rob Miles: http://bit.ly/Rob_Miles_YouTube

Thanks to Nottingham Hackspace for providing the filming location: http://bit.ly/notthack

https://www.facebook.com/computerphile
https://twitter.com/computer_phile

This video was filmed and edited by Sean Riley.

Computer Science at the University of Nottingham: https://bit.ly/nottscomputer

Computerphile is a sister project to Brady Haran's Numberphile. More at http://www.bradyharan.com

Generative AI
2,691,852 Views · 3 years ago

Comparing the recently released GPT-J-6B from Eleuther and Curie from OpenAI on a few supposedly simple prompts.
__________

My courses:

Neural Networks with Tensorflow: http://bit.ly/tensorflownets
Machine Learning with Scikit-Learn and Python: http://bit.ly/mlpractical
Artificial Neural Nets with Neurolab: https://bit.ly/artificialnets

Use AI to enhance your work: http://bit.ly/aitotherescue

Training and coaching and contract work:

https://cristivlad.com/contact

Connect with me:

Linkedin: https://www.linkedin.com/in/cristivlad/
Twitter: @cristivlad25

Support this channel:

https://www.buymeacoffee.com/cristivlad
https://www.patreon.com/cristivlad

Coupons:

Paperspace credit: https://paperspace.io/&R=FMXH1BN
DigitalOcean credit: https://m.do.co/c/efe4365e60bd
Use AI to enhance your work: http://bit.ly/aitotherescue
__________

This video is for educational purposes only.
__________

At the time of publishing the video, all non-original content has been appropriately cited and referenced.

However, if at any later time copyright or other issues arise, please reach out to me first before filing a claim with YouTube, and I will gladly work towards solving them, whatever they may be.

Contact details are on the channel's "About" page!
Please consider "fair use" before filing a claim. Thank You!

Generative AI
2,739,817 Views · 3 years ago

Thanks to ML6 for virtually hosting us tonight! For those who would like to attend live, have a look at https://www.meetup.com/ai-camp....us-berlin/events/282

Talk 1

Title: How we trained our own Dutch GPT-2 using Transformers

Speaker: Thomas Vrancken

Abstract:
Text generation and the GPT series of Transformer models have been a hot topic since the public got to know their astounding power. The latest, GPT-3, can mimic a human conversation to an almost scary level.
At ML6 we trained and open-sourced our own Dutch GPT-2 model using Hugging Face's Transformers library. This talk addresses the questions:
How do you do that? What kind of data do you need, and how do you access enough compute power to actually train the model?

Bio:
Thomas Vrancken is an ML Engineer at ML6 with a background in strategy consulting and research. Thomas is passionate about NLP, data science and making real world impacts with creative Machine Learning applications. Staying versatile is his credo, joining a maximum of events with interesting talks is a means of achieving it.

Talk 2

Title: Efficient Transformers

Speaker: Mats Uytterhoeven

Abstract:
In recent years we’ve seen an exponential increase in the size of pre-trained transformer based models and although they push the state-of-the-art to ever greater heights, they also become increasingly cumbersome to work with. This has prompted researchers around the world to try and find more efficient alternatives to the classic transformer architecture and has spawned an interesting new research direction. In this talk, we will have a look at some of the interesting ideas in this area and what the future may hold for these transformer based models.

Bio:
Mats Uytterhoeven is an ML Engineer at ML6 interested in a broad range of topics. His main focus is on NLP and unsupervised learning problems. When he's not hacking on machine learning code, he likes playing tennis, reading (non-fiction), and traveling. He believes machine learning can have a positive impact on people's lives and loves working on projects that can make a difference.

Generative AI
2,331,087 Views · 3 years ago

You’ll learn how ChatGPT works, which brings many benefits: it helps you use the model more effectively, evaluate its outputs more critically, and stay informed about the latest developments in the field, so you are better prepared to take advantage of new opportunities.

ChatGPT is a type of natural language processing (NLP) model known as a Generative Pretrained Transformer (GPT), developed by OpenAI. These are the two big terms we will focus on in this video. On top of that, you will also get a base understanding of common Machine Learning techniques like supervised learning and reinforcement learning, which were used to make ChatGPT as good as it is.
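As a tiny taste of the generative part (a toy sketch for illustration, not OpenAI's actual code): a GPT picks each next token by sampling from a probability distribution over its vocabulary, where a "temperature" controls how random the choice is.

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, rng=None):
    """Sample a token id from raw logits; lower temperature = less random."""
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / temperature
    probs = np.exp(scaled - scaled.max())      # numerically stable softmax
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

logits = [2.0, 0.5, -1.0]                      # toy scores for a 3-token vocab
token = sample_next_token(logits, temperature=0.7, rng=np.random.default_rng(0))
```

Generating a whole reply is just this step repeated: append the sampled token to the context and predict again.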

Use ChatGPT here: https://chat.openai.com/

Follow me on Twitter: https://twitter.com/bPGTill
My Lightning Address: ⚡️till@getalby.com

My Discord server: https://discord.gg/e5KXwadq4s
Instagram: https://www.instagram.com/tillmusshoff/

Video on my second channel about building a blog website with ChatGPT: https://youtu.be/Z6IYxbMdOms

Further Sources:
Scale AI, OpenAI's Greg Brockman: The Future of LLMs, Foundation & Generative Models (DALL·E 2 & GPT-3): https://youtu.be/Rp3A5q9L_bg

Generative AI
2,298,380 Views · 3 years ago

Also check out my 2nd channel ( on Trading, Crypto & Investments ) - https://www.youtube.com/channe....l/UChMwVQBFtaOga5Mh0

I am a Banker turned Trader & Machine Learning Engineer | Kaggle Master. Follow me on 🐦 TWITTER: https://twitter.com/rohanpaul_ai - for daily tips on Trading, Crypto & Machine Learning.

Link to Hugging Face Model - https://huggingface.co/EleutherAI/gpt-neo-2.7B

----------------

🚀🔬 Check out my Generative Adversarial Network (GAN) video course on Gumroad -

7.5 Hours of Course - 6 different GAN Architecture implementations from scratch with #PyTorch

🚀🔬 https://bit.ly/3Ozcf3y

You can find me here:

**********************************************
🐦 TWITTER: https://twitter.com/rohanpaul_ai
👨‍🏫 Udemy: https://www.udemy.com/user/rohan-paul/
🧑🏼‍💻 Gumroad : https://rohanpaul.gumroad.com/
🎨 Redbubble: https://rdbl.co/3NMmJLU
👨‍🔧 Kaggle: https://www.kaggle.com/paulrohan2020
👨🏻‍💼 LINKEDIN: https://www.linkedin.com/in/rohan-paul-b27285129/
👨‍💻 GITHUB: https://github.com/rohan-paul
🧑‍🦰 Facebook Page: https://www.facebook.com/rohanpaulai/
📸 Instagram: https://www.instagram.com/rohan_paul_2020/

**********************************************


Other Playlist you might like 👇

🟠 MachineLearning & DeepLearning Concepts & interview Question Playlist - https://bit.ly/380eYDj

🟠 ComputerVision / DeepLearning Algorithms Implementation Playlist - https://bit.ly/36jEvpI

🟠 DataScience | MachineLearning Projects Implementation Playlist - https://bit.ly/39MEigt

🟠 Natural Language Processing Playlist : https://bit.ly/3P6r2CL



#machinelearning #datascience #nlp #textprocessing #kaggle #tensorflow #pytorch #deeplearning #deeplearningai #100daysofmlcode #neuralnetworks #pythonprogramming #python #100DaysOfMLCode



