Top videos

Generative AI
2,740,802 Views · 3 years ago

Graph machine learning has become very popular in recent years in the machine learning and engineering communities. In this video, we explore the math behind some of the most popular graph neural network algorithms!
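One of the formulas such videos typically walk through is the graph convolutional network (GCN) propagation rule. Below is a minimal NumPy sketch of a single GCN layer, assuming the standard symmetrically normalized formulation; the toy graph, feature sizes, and function name are illustrative and not taken from the video:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph convolution layer: H = ReLU(D^-1/2 (A + I) D^-1/2 X W).

    A: (n, n) adjacency matrix, X: (n, f_in) node features,
    W: (f_in, f_out) learnable weights.
    """
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt     # symmetric normalization
    return np.maximum(A_norm @ X @ W, 0.0)       # aggregate, transform, ReLU

# Toy 3-node path graph: edges 0-1 and 1-2
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.random.randn(3, 4)   # 4 input features per node
W = np.random.randn(4, 2)   # project to 2 output features
print(gcn_layer(A, X, W).shape)  # (3, 2)
```

The self-loops keep each node's own features in the aggregation, and the symmetric normalization stops high-degree nodes from dominating their neighbors' updates.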

Support the channel by liking, commenting, subscribing, and recommending this video to your friends, coworkers, or colleagues if you think they'll find it valuable!

Other Videos in this Series
Why use graphs for machine learning? https://youtu.be/mu1Inz3ltlo
Intro to graph neural networks https://youtu.be/cka4Fa4TTI4
Spatio-Temporal Graph Neural Networks https://youtu.be/RRMU8kJH60Q

My Links:
Youtube: https://www.youtube.com/channe....l/UCpXbaIslF2ZKeJ6rW
Twitter: https://twitter.com/jhanytime
Reddit: https://old.reddit.com/user/jhanytime/

Other Links:
Slides: https://drive.google.com/file/....d/1C9rRQ46kfFVQUNrg8

Generative AI
2,740,659 Views · 3 years ago

In this course we implement the most popular Machine Learning algorithms from scratch using only Python and NumPy.

Get my Free NumPy Handbook:
https://www.python-engineer.com/numpybook

✅ Write cleaner code with Sourcery, instant refactoring suggestions in VS Code & PyCharm: https://sourcery.ai/?utm_source=youtube&utm_campaign=pythonengineer *

⭐ Join Our Discord : https://discord.gg/FHMg9tKFSN

📓 Notebooks available on Patreon:
https://www.patreon.com/patrickloeber

If you enjoyed this video, please subscribe to the channel!

Code:
https://github.com/patrickloeber/MLfromscratch

Timeline:
00:00 - Introduction
00:56 - 1 KNN
23:00 - 2 Linear Regression
43:38 - 3 Logistic Regression
1:00:50 - 4 Regression Refactoring
1:08:25 - 5 Naive Bayes
1:29:11 - 6 Perceptron
1:47:01 - 7 SVM
2:06:33 - 8 Decision Tree Part 1
2:17:12 - 9 Decision Tree Part 2
2:48:00 - 10 Random Forest
3:01:22 - 11 PCA
3:18:43 - 12 K-Means
3:48:15 - 13 AdaBoost
4:15:53 - 14 LDA
4:38:10 - 15 Load Data From CSV
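
To give a flavor of the from-scratch style used throughout the course, here is a minimal k-nearest-neighbors classifier in plain NumPy, roughly the idea behind the first chapter; this is a sketch only, and the names and defaults are illustrative rather than taken from the linked repository:

```python
import numpy as np
from collections import Counter

class KNN:
    """Minimal k-nearest-neighbors classifier using Euclidean distance."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # Lazy learner: just memorize the training data.
        self.X_train = np.asarray(X, dtype=float)
        self.y_train = np.asarray(y)

    def predict(self, X):
        return np.array([self._predict_one(x) for x in np.asarray(X, dtype=float)])

    def _predict_one(self, x):
        # Distance from x to every training sample.
        distances = np.sqrt(((self.X_train - x) ** 2).sum(axis=1))
        # Labels of the k closest samples, then majority vote.
        k_labels = self.y_train[np.argsort(distances)[:self.k]]
        return Counter(k_labels).most_common(1)[0][0]

# Tiny sanity check on two obvious clusters.
X = np.array([[0, 0], [0, 1], [5, 5], [5, 6]])
y = np.array([0, 0, 1, 1])
clf = KNN(k=3)
clf.fit(X, y)
print(clf.predict([[0.2, 0.1], [5.1, 5.2]]))  # expected: [0 1]
```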

You can find me here:
Website: https://www.python-engineer.com
Twitter: https://twitter.com/patloeber
GitHub: https://github.com/patrickloeber

#Python #MachineLearning

----------------------------------------------------------------------------------------------------------
* This is a sponsored or affiliate link. Clicking it costs you nothing extra; instead, you will be supporting me and my project. Thank you so much for the support! 🙏

Generative AI
2,739,921 Views · 3 years ago

All rights owned by Columbia Pictures

Generative AI
2,739,816 Views · 3 years ago

Thanks to ML6 for hosting us virtually tonight! For those who would like to attend live, have a look at https://www.meetup.com/ai-camp....us-berlin/events/282

Talk 1

Title: How we trained our own Dutch GPT-2 using Transformers

Speaker: Thomas Vrancken

Abstract:
Text generation and the GPT series of Transformer models have been a hot topic ever since the public discovered their astounding power. The latest model, GPT-3, can mimic a human conversation to an almost scary degree.
At ML6 we trained and open-sourced our own Dutch GPT-2 model using Hugging Face's Transformers library. This talk addresses the questions:
How do you do that? What kind of data do you need, and how do you get access to enough compute to actually train the model?
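
As a taste of what the finished model looks like from the user's side, here is a short sketch using the Transformers pipeline API; the model identifier below is an assumption for illustration and may not match the checkpoint ML6 actually published:

```python
from transformers import pipeline

# Model id is assumed for illustration; substitute the checkpoint ML6 released.
generator = pipeline("text-generation", model="ml6team/gpt2-small-dutch")

prompt = "Het weer in Gent is vandaag"  # "The weather in Ghent today is"
outputs = generator(prompt, max_length=40, num_return_sequences=2, do_sample=True)
for out in outputs:
    print(out["generated_text"])
```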

Bio:
Thomas Vrancken is an ML Engineer at ML6 with a background in strategy consulting and research. Thomas is passionate about NLP, data science, and making a real-world impact with creative machine learning applications. Staying versatile is his credo, and attending as many events with interesting talks as possible is one way he achieves it.

Talk 2

Title: Efficient Transformers

Speaker: Mats Uytterhoeven

Abstract:
In recent years we’ve seen an exponential increase in the size of pre-trained transformer-based models. Although they push the state of the art to ever greater heights, they also become increasingly cumbersome to work with. This has prompted researchers around the world to look for more efficient alternatives to the classic transformer architecture and has spawned an interesting new research direction. In this talk, we will look at some of the interesting ideas in this area and what the future may hold for these transformer-based models.
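
One representative idea from this research direction is linear attention, which swaps the softmax for a kernel feature map so that attention cost grows linearly rather than quadratically with sequence length. A rough NumPy sketch of the idea, not taken from the talk itself:

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    """O(n) attention via a kernel feature map (here elu(x) + 1),
    in the spirit of linear-transformer variants.
    Q, K: (n, d) queries/keys, V: (n, d_v) values."""
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                      # (d, d_v): summarize keys/values once
    Z = Qp @ Kp.sum(axis=0) + eps      # (n,): per-query normalizer
    return (Qp @ KV) / Z[:, None]      # (n, d_v)

n, d = 8, 4
Q, K, V = (np.random.randn(n, d) for _ in range(3))
print(linear_attention(Q, K, V).shape)  # (8, 4)
```

The key point is that the key-value summary is computed once and reused for every query, which is what removes the quadratic term in sequence length.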

Bio:
Mats Uytterhoeven is an ML Engineer at ML6 interested in a broad range of topics. His main focus is on NLP and unsupervised learning problems. When he's not hacking on machine learning code, he likes playing tennis, reading (non-fiction), and traveling. He believes machine learning can have a positive impact on people's lives and loves working on projects that can make a difference.

Data Analytics
2,739,110 Views · 3 years ago

Generative AI
2,738,259 Views · 3 years ago

StarGAN: Unified Generative Adversarial Networks for Multi-Domain Image-to-Image Translation

Course Materials: https://github.com/maziarraiss....i/Applied-Deep-Learn



