Top videos

Generative AI
2,756,977 Views · 3 years ago

Patreon: https://www.patreon.com/mlst
Discord: https://discord.gg/ESrGqhf5CB

"Symmetry, as wide or narrow as you may define its meaning, is one idea by which man through the ages has tried to comprehend and create order, beauty, and perfection." and that was a quote from Hermann Weyl, a German mathematician who was born in the late 19th century.

The last decade has witnessed an experimental revolution in data science and machine learning, epitomised by deep learning methods. Many high-dimensional learning tasks previously thought to be beyond reach -- such as computer vision, playing Go, or protein folding -- are in fact tractable given enough computational horsepower. Remarkably, the essence of deep learning is built from two simple algorithmic principles: first, the notion of representation or feature learning and second, learning by local gradient-descent type methods, typically implemented as backpropagation.
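As a toy illustration of the second principle, here is a minimal sketch of learning by local gradient descent on a one-parameter least-squares fit (the data, learning rate, and iteration count are illustrative choices, not taken from the episode):

```python
# Minimal sketch: fit y = w * x by stepping against the gradient of the mean squared error.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]      # generated by the "true" parameter w = 2
w, lr = 0.0, 0.05
for _ in range(200):
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad        # local gradient-descent update
print(w)                  # approaches 2.0
```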

While learning generic functions in high dimensions is a cursed estimation problem, many tasks are not uniform and have strong repeating patterns as a result of the low-dimensionality and structure of the physical world.

Geometric Deep Learning unifies a broad class of ML problems from the perspectives of symmetry and invariance. These principles not only underlie the breakthrough performance of convolutional neural networks and the recent success of graph neural networks but also provide a principled way to construct new types of problem-specific inductive biases.
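As one concrete illustration of building symmetry into an architecture (a sketch under our own assumptions, not code from the proto-book), a graph layer that aggregates neighbour features by summation commutes with any relabelling of the nodes, so a sum readout over nodes is permutation-invariant:

```python
import numpy as np

def message_passing_layer(X, A, W):
    """One permutation-equivariant step: sum neighbour features, apply a shared linear map, ReLU."""
    return np.maximum(A @ X @ W, 0.0)

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))                                   # node features
A = np.array([[0, 1, 1, 0], [1, 0, 0, 1],
              [1, 0, 0, 1], [0, 1, 1, 0]], dtype=float)       # adjacency matrix
W = rng.normal(size=(3, 2))                                   # shared weights
P = np.eye(4)[[2, 0, 3, 1]]                                   # a node relabelling (permutation matrix)

graph_readout = message_passing_layer(X, A, W).sum(axis=0)
graph_readout_perm = message_passing_layer(P @ X, P @ A @ P.T, W).sum(axis=0)
print(np.allclose(graph_readout, graph_readout_perm))         # True: invariant to node ordering
```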

This week we spoke with Professor Michael Bronstein (Head of Graph ML at Twitter), Dr. Petar Veličković (Senior Research Scientist at DeepMind), Dr. Taco Cohen, and Prof. Joan Bruna about their new proto-book Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges.

Enjoy the show!

Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges
https://arxiv.org/abs/2104.13478

[00:00:00] Tim Intro
[00:01:55] Fabian Fuchs article
[00:04:05] High dimensional learning and curse
[00:05:33] Inductive priors
[00:07:55] The proto book
[00:09:37] The domains of geometric deep learning
[00:10:03] Symmetries
[00:12:03] The blueprint
[00:13:30] NNs don't deal with network structure (TEDx)
[00:14:26] Penrose - standing edition
[00:15:29] Past decade revolution (ICLR)
[00:16:34] Talking about the blueprint
[00:17:11] Interpolated nature of DL / intelligence
[00:21:29] Going back to Euclid
[00:22:42] Erlangen program
[00:24:56] “How is geometric deep learning going to have an impact”
[00:26:36] Introduce Michael and Petar
[00:28:35] Petar Intro
[00:32:52] Algorithmic reasoning
[00:36:16] Thinking fast and slow (Petar)
[00:38:12] Taco Intro
[00:46:52] Deep learning is the craze now (Petar)
[00:48:38] On convolutions (Taco)
[00:53:17] Joan Bruna's voyage into geometric deep learning
[00:56:51] What is your most passionately held belief about machine learning? (Bronstein)
[00:57:57] Is the function approximation theorem still useful? (Bruna)
[01:11:52] Could an NN learn a sorting algorithm efficiently (Bruna)
[01:17:08] Curse of dimensionality / manifold hypothesis (Bronstein)
[01:25:17] Will we ever understand approximation of deep neural networks (Bruna)
[01:29:01] Can NNs extrapolate outside of the training data? (Bruna)
[01:31:21] What areas of math are needed for geometric deep learning? (Bruna)
[01:32:18] Graphs are really useful for representing most natural data (Petar)
[01:35:09] What was your biggest aha moment early on? (Bronstein)
[01:39:04] What gets you most excited? (Bronstein)
[01:39:46] Main show kick off + Conservation laws
[01:49:10] Graphs are king
[01:52:44] Vector spaces vs discrete
[02:00:08] Does language have a geometry? Which domains can geometry not be applied to? + Category theory
[02:04:21] Abstract categories in language from graph learning
[02:07:10] Reasoning and extrapolation in knowledge graphs
[02:15:36] Transformers are graph neural networks?
[02:21:31] Tim never liked positional embeddings
[02:24:13] Is the case for invariance overblown? Could they actually be harmful?
[02:31:24] Why is geometry a good prior?
[02:34:28] Augmentations vs architecture and on learning approximate invariance
[02:37:04] Data augmentation vs symmetries (Taco)
[02:40:37] Could symmetries be harmful (Taco)
[02:47:43] Discovering group structure (from Yannic)
[02:49:36] Are fractals a good analogy for physical reality?
[02:52:50] Is physical reality high dimensional or not?
[02:54:30] Heuristics which deal with permutation blowups in GNNs
[02:59:46] Practical blueprint of building a geometric network architecture
[03:01:50] Symmetry discovering procedures
[03:04:05] How could real world data scientists benefit from geometric DL?
[03:07:17] Most important problem to solve in message passing in GNNs
[03:09:09] Better RL sample efficiency as a result of geometric DL (XLVIN paper)
[03:14:02] Geometric DL helping latent graph learning
[03:17:07] On intelligence
[03:23:52] Convolutions on irregular objects (Taco)

Generative AI
2,756,208 Views · 3 years ago

The Caribbean is one of the world's most beautiful regions. Enjoy this 4K Scenic Relaxation Film featuring over 25 Caribbean islands and countries. From the beautiful coast of Barbados to the jagged Pitons of St. Lucia, the Caribbean is home to some of the world's most beautiful places. Where is your favorite Caribbean destination?

Our other Relaxation films:

Europe 4K - https://youtu.be/0xhzwDXfLds

Fiji 4K - https://youtu.be/zleIaEIBs2M

French Polynesia 4K - https://youtu.be/YbUkAJHCgd0

Switzerland 4K - https://youtu.be/fyOVKyaKJq4

Norway 4K - https://youtu.be/Bxo2JkiqG_o

Ireland 4K - https://youtu.be/dR-BW-alOF4

Portugal 4K - https://youtu.be/AJV6uXGu70Y

France 4K - https://youtu.be/tHztN9inrOw

Italy 4K - https://youtu.be/H4tyzzP33Cw

Scotland 4K - https://youtu.be/gGCwgCe3WtQ

Germany 4K - https://youtu.be/6K0sajMdAnk

Croatia 4K - https://youtu.be/l8vnE91bV0U

The Alps 4K - https://youtu.be/BTMjD7_evjE

Mediterranean 4K - https://youtu.be/fTn-XfauCPI

Follow us on Instagram @scenicrelaxationfilms

Where we get our music - https://fm.pxf.io/gbYAm9
Great Place for Stock footage - https://bit.ly/38b1EJH
Great Place for Music - https://bit.ly/3GptQHd
Great Place for Assets - https://bit.ly/3K59ZPK
Free stock footage, guides & luts - https://sellfy.com/ryanshirley

Timestamps:

0:00 - Flying over the Caribbean
7:00 - Martinique
10:01 - St. Lucia
13:03 - Guadeloupe
16:00 - Dominican Republic
18:45 - Puerto Rico
20:38 - Barbados
22:18 - Virgin Islands
23:06 - St. Kitts & Antigua
25:01 - Jamaica
26:54 - Bahamas
28:57 - Venezuela
30:35 - Colombia
32:08 - Mexico
34:05 - Belize
34:42 - Honduras & Panama
35:41 - Curaçao
37:00 - Trinidad & Tobago
38:41 - Animals of the Caribbean
41:32 - Florida Keys & Grand Cayman
42:55 - Cuba
43:26 - Caribbean Landscapes

Thanks for watching :)

Generative AI
2,754,708 Views · 3 years ago

🔥Intellipaat AI Deep Learning course: https://intellipaat.com/artifi....cial-intelligence-ma
In this deep learning interview questions and answers video you will learn the latest and most frequently asked questions that companies ask in deep learning interviews. The video covers everything from basic to advanced questions so that you can get the most benefit.
#DeepLearningInterviewQuestionsAndAnswers #AIandDeepLearningInterviewQuestions #DeepLearningInterviewQuestions #DeepLearningInterview #DeepLearning #MachineLearning #AIInterviewQuestions

📌 Do subscribe to the Intellipaat channel & get regular updates on videos: http://bit.ly/Intellipaat

🔗 Watch AI video tutorials here: http://bit.ly/2F1Bhqt

📕 Read complete AI tutorial here: https://intellipaat.com/blog/t....utorial/artificial-i

📝 The following questions are covered in this deep learning video:
00:00 - Deep Learning Interview Questions And Answers
00:59 - What is the Difference between Machine Learning and Deep Learning?
02:16 - What is a Perceptron?
03:26 - How is Deep Learning better than Machine Learning?
04:29 - What are some of the most used applications of Deep Learning?
05:27 - What is the meaning of Overfitting?
06:47 - What are Activation functions?
08:00 - Why is Fourier transform used in Deep Learning?
08:55 - What are the steps involved in training a perceptron in Deep learning?
09:47 - What is the use of the loss function?
10:30 - What are some of the Deep Learning Frameworks or tools that you have used?
11:49 - What is the use of the swish function?
12:38 - What are autoencoders?
13:41 - What are the steps to be followed to use the gradient descent algorithm?
14:57 - Differentiate between a single layer perceptron and a multi-layer perceptron
16:00 - What is data normalization in Deep Learning?
16:54 - What is forward propagation?
17:44 - What is back propagation?
18:40 - What are Hyperparameters in Deep Learning?
19:19 - How can hyperparameters be trained in neural networks?
21:38 - What is the meaning of dropout in Deep Learning?
22:42 - What are Tensors?
23:44 - What is the meaning of model capacity in Deep Learning?
24:33 - What is a Boltzmann Machine?
25:25 - What are some of the advantages of using TensorFlow?
26:27 - What is the computational graph in Deep Learning?
27:40 - What is a CNN?
28:25 - What are the various layers present in a CNN?
30:19 - What is an RNN in Deep Learning?
31:15 - What is the vanishing gradient problem when using RNNs?
32:11 - What is the exploding gradient problem in Deep Learning?
33:10 - What is the use of LSTM?
34:04 - Where are autoencoders used?
35:05 - What are the types of autoencoders?
35:35 - What is a restricted Boltzmann Machine?
36:30 - What are some of the limitations of Deep Learning?
38:04 - What are the variants of gradient descent?
39:33 - Why is mini-batch gradient descent so popular?
40:35 - What are deep autoencoders?
41:47 - Why is the Leaky ReLU function used in Deep Learning?
42:35 - What are some examples of supervised learning algorithms in Deep Learning?
43:25 - What are some examples of unsupervised learning algorithms in Deep Learning?
43:56 - Can we initialize the weights of a network to start from zero?
45:00 - What is the meaning of valid padding and same padding in CNN?
46:16 - What are some of the applications of transfer learning in Deep Learning?
47:25 - How is the transformer architecture better than RNNs in Deep Learning?
48:41 - What are the steps involved in the working of an LSTM network?
50:07 - What are the elements in TensorFlow that are programmable?
50:43 - What is the meaning of bagging and boosting in Deep Learning?
51:52 - What are generative adversarial networks (GANs)?
53:00 - Have you earned any sort of Certification to improve your learning and implementation process?
----------------------------
Intellipaat Edge
1. 24*7 Lifetime Access & Support
2. Flexible Class Schedule
3. Job Assistance
4. Mentors with 14+ yrs of experience
5. Industry-Oriented Courseware
6. Lifetime Free Course Upgrade
------------------------------
Why should you opt for an Artificial Intelligence career?

If you want to fast-track your career then you should strongly consider Artificial Intelligence. The reason for this is that it is one of the fastest-growing technologies. There is a huge demand for professionals in Artificial Intelligence, the salaries for AI professionals are fantastic, and there is huge growth opportunity in this domain as well.
------------------------------

For more Information:
Please write to us at sales@intellipaat.com, or call us at +91-7847955955 (India) or 1-800-216-8930 (US, Toll Free)

Website: https://intellipaat.com/artifi....cial-intelligence-ma

Facebook: https://www.facebook.com/intellipaatonline

LinkedIn: https://www.linkedin.com/compa....ny/intellipaat-softw

Twitter: https://twitter.com/Intellipaat

Generative AI
2,754,423 Views · 3 years ago

In many scenarios you need more customized training than model.fit supports out of the box. What's great is that we can modify how a training step is done inside model.fit and increase the flexibility of training models. In this video I show how this is done; in the next video I will show you how to build custom training loops completely from scratch for scenarios where even more flexibility is needed.
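For reference, here is a minimal sketch of that pattern using the standard Keras approach of overriding train_step in a subclassed model (assuming the TensorFlow 2.x Keras API; the model, data, and hyperparameters below are placeholders, not the ones used in the video):

```python
import tensorflow as tf

class CustomModel(tf.keras.Model):
    def train_step(self, data):
        x, y = data                                    # one batch as passed to fit()
        with tf.GradientTape() as tape:
            y_pred = self(x, training=True)            # forward pass
            loss = self.compiled_loss(y, y_pred)       # loss configured in compile()
        grads = tape.gradient(loss, self.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.trainable_variables))
        self.compiled_metrics.update_state(y, y_pred)  # update metrics from compile()
        return {m.name: m.result() for m in self.metrics}

inputs = tf.keras.Input(shape=(32,))
outputs = tf.keras.layers.Dense(1)(inputs)
model = CustomModel(inputs, outputs)
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
# model.fit(...) now runs our custom train_step on every batch.
model.fit(tf.random.normal((64, 32)), tf.random.normal((64, 1)), epochs=1, verbose=0)
```

Everything else about fit (callbacks, epochs, metric logging) keeps working as usual, which is exactly the added flexibility described above.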

I learned a lot and was inspired to make these TensorFlow videos by the TensorFlow Specialization on Coursera. Below you'll find both affiliate and non-affiliate links; the pricing is the same for you, but a small commission goes back to the channel if you buy it through the affiliate link.
affiliate: https://bit.ly/3t3tgI5
non-affiliate: https://bit.ly/3kZgN5B

GitHub Repository:
https://github.com/aladdinpers....son/Machine-Learning

✅ Equipment I use and recommend:
https://www.amazon.com/shop/aladdinpersson

❤️ Become a Channel Member:
https://www.youtube.com/channe....l/UCkzW5JSFwvKRjXABI

✅ One-Time Donations:
Paypal: https://bit.ly/3buoRYH
Ethereum: 0xc84008f43d2E0bC01d925CC35915CdE92c2e99dc

▶️ You Can Connect with me on:
Twitter - https://twitter.com/aladdinpersson
LinkedIn - https://www.linkedin.com/in/al....addin-persson-a95384
GitHub - https://github.com/aladdinpersson

TensorFlow Playlist:
https://www.youtube.com/playli....st?list=PLhhyoLH6Ijf



