Id
int64 1
2.5k
| English
stringlengths 5
630
| Burmese
stringlengths 5
585
|
|---|---|---|
1
|
Welcome to the Hugging Face Course.
|
Hugging Face áááşáááşá¸ááž ááźááŻáááŻááŤáááşá
|
2
|
This course has been designed to teach you all about the Hugging Face ecosystem, how to use the dataset and model hub as well as all our open-source libraries.
|
ááŽáááşáááşá¸ááᯠHugging Face áá˛áˇ ááąáá
áá
áş áĄááźáąáŹááşá¸á dataset áá˛áˇ model hub áá˝áąááᯠáááşáááŻáĄááŻáśá¸ááźáŻáááá˛á ááťá˝ááşáŻááşáááŻáˇáá˛áˇ open-source library áá˝áąáĄáŹá¸ááŻáśá¸ááᯠáááşáááŻááŻáśá¸áááá˛áááŻááŹáá˝áąááᯠáááşááźáŹá¸ááąá¸áááŻáˇ ááąá¸áá˝á˛ááŹá¸ááŹááŤá
|
3
|
Here is the Table of Contents.
|
ááŽáážáŹááąáŹáˇ áááşáááşá¸áá˛áˇ áĄááźáąáŹááşá¸áĄááŹááťáŹá¸ áĄáá˝ážááşá¸ ááźá
áşááŤáááşá
|
4
|
As you can see, it's divided in three sections which become progressively more advanced.
|
áááşááźááşáá˛áˇáĄáááŻááşá¸ ááŽáááşáááşá¸ááᯠáĄáááŻááşá¸ááŻáśá¸áááŻááşá¸ áá˝á˛ááźáŹá¸ááŹá¸ááźáŽá¸ áá
áşáááŻááşá¸ááźáŽá¸áá
áşáááŻááşá¸ áááŻáááŻáĄáááˇáşááźááˇáşááŹáážáŹ ááźá
áşááŤáááşá
|
5
|
At this stage, the first two sections have been released.
|
ááŽáĄáááˇáşáážáŹááąáŹáˇ áááááŻáśá¸ áĄáááŻááşá¸áážá
áşáááŻááşá¸ááᯠááŻááşááźááşááźáŽá¸ááŤááźáŽá
|
6
|
So first, we'll teach you the basics of how to use a Transformer model, fine-tune it on your own data set and share the result with the community.
|
ááááŚá¸á
á˝áŹ Transformer model ááᯠáááşáááŻáĄááŻáśá¸ááźáŻááááşá áááŻááşáááŻááş data set ááąáŤáşáážáŹ áááşááᯠfine-tune ááŻááşááááşá ááźáŽá¸ááąáŹáˇ ááááşááᯠáĄáááŻááşá¸áĄáááŻááşá¸áá˛áˇ áááşáááŻááťážááąááááşáááŻáá˛áˇ áĄááźáąááśáá˝áąááᯠáááşááźáŹá¸ááąá¸ááŤáááşá
|
7
|
So second, we'll dive deeper into our libraries and teach you how to tackle any NLP task.
|
ááŻááááĄááąáá˛áˇ ááťá˝ááşáŻááşáááŻáˇáá˛áˇ library áá˝áąáá˛ááᯠáááŻáááŻáááşáááşáá˛áᲠááąáˇááŹááźáŽá¸ áááşáááˇáş NLP (áááŹáááŹááŹá
ááŹá¸ á
áŽááśááąáŹááşáá˝ááşááźááşá¸) ááŻááşáááşá¸áááŻáááᯠááźáąáážááşá¸áááŻááşáááˇáş áááşá¸áááşá¸áá˝áąááᯠáááşááąá¸ááŤáááşá
|
8
|
We're actively working on the last one and hope to have it ready for you for the spring of 2022.
|
ááąáŹááşááŻáśá¸áĄáááŻááşá¸áááŻááąáŹáˇ ááťá˝ááşáŻááşáááŻáˇ áááşááźá˝á
á˝áŹ ááąáŹááşáá˝ááşááąááźáŽá¸ áááá ááŻáážá
áş áá˝áąáŚá¸ááŹááŽáážáŹ áĄáááˇáşááźá
áşáĄáąáŹááş ááŻááşááąá¸áááŻááşááááˇáşáááşáááŻáˇ ááťážáąáŹáşáááˇáşááŤáááşá
|
9
|
The first chapter requires no technical knowledge and is a good introduction to learn what Transformers models can do and how it could be of use to you or your company.
|
ááááĄáááşá¸áĄáá˝ááşááąáŹáˇ áááşá¸áááŹáááŻááşá¸áááŻááşáᏠáĄááááᏠááááŻáĄááşááŤáá°á¸á Transformer ááąáŹáşáááşáá˝áą ááŹáá˝áąááŻááşáááŻááşáááşá ááŤáá˝áąá áááş ááŤáážáááŻááş áááˇáşááŻáášáááŽáĄáá˝ááş áááşáááŻáĄááŻáśá¸áááşáááŻááşáááşáááŻáᏠááąáˇááŹáááŻáˇáĄáá˝ááş ááąáŹááşá¸áá˝ááşáá˛áˇ ááááŤááşá¸áá
áşáᯠááźá
áşááŤáááşá
|
10
|
The next chapters require a good knowledge of Python and some basic knowledge of Machine Learning and Deep Learning.
|
ááąáŹááşáĄáááşá¸áá˝áąáĄáá˝ááşááąáŹáˇ Python ááᯠááąáŹááşá¸ááąáŹááşá¸áááááŻáˇáá˛áˇ Machine Learningá Deep Learning áááŻáˇáá˛áˇ áĄááźáąááśáĄáááááŹáĄááťááŻáˇ áááŻáĄááşááŤáááşá
|
11
|
If you don't know what a training and validation set are or what gradient descent means, you should look at an introductory course such as the ones published by deeplearning.ai or fast.ai.
|
áĄáááşá training set áá˛áˇ validation set áááŻááŹááŹáá˛á ááŤáážáááŻááş gradient descent áááŻáᏠááŹáááŻáááŻáááŻááá˛áááŻáᏠááááá°á¸áááŻáááş deeplearning.ai ááŤáážáááŻááş fast.ai áááŻáˇá ááŻááşááąááŹá¸áá˛áˇ ááááŤááşá¸áááşáááşá¸áá˝áąáááŻááťááŻá¸ áááşáááşá¸áá˝áąááᯠááźááˇáşáááˇáşááŤáááşá
|
12
|
It's also best if you have some basics in one Deep Learning Framework, PyTorch or TensorFlow.
|
Deep Learning Framework áá
áşááŻááŻááźá
áşáá˛áˇ PyTorch ááŤáážáááŻááş TensorFlow áážáŹ áĄááźáąááśáĄááťááŻáˇ áážáááŹá¸áááşáááŻáááş áááŻááąáŹááşá¸ááŤáááşá
|
13
|
Each part of the material introduced in this course has a version in both those frameworks, so you will be able to pick the one you are most comfortable with.
|
ááŽáááşáááşá¸áážáŹ ááááşáááşááąá¸ááŹá¸áá˛áˇ áĄááźáąáŹááşá¸áĄáᏠáĄá
áááşáĄáááŻááşá¸áááŻááşá¸áážáŹ áĄá˛áᎠframework áážá
áşááŻááŻáśá¸áĄáá˝ááş version áá˝áą áážáááŤáááşá ááŤááźáąáŹááˇáş áááşáĄááťá˝ááşá¸ááťááşááŻáśá¸ááźá
áşáá˛áˇ áá
áşááŻááᯠáá˝áąá¸ááťááşáááŻááşááŤááááˇáşáááşá
|
14
|
This is the team that developed this course.
|
ááŤáááąáŹáˇ ááŽáááşáááşá¸ááᯠááąá¸áá˝á˛áá˛áˇáá˛áˇ áĄáá˝á˛áˇáᲠááźá
áşááŤáááşá
|
15
|
I'll now let each of the speakers introduce themselves briefly.
|
áĄááŻáááŻáááş á
ááŹá¸ááźáąáŹáááˇáşáá° áá
áşáŚá¸ááťááşá¸á
áŽááᯠáá°áááŻáˇáááŻááşáá°áááŻáˇ áĄáááŻááťáŻáśá¸ ááááşáááşááąá¸áááŻáˇ áá˝ááˇáşááźáŻááŤáááşá
|
16
|
Hi, my name is Matthew, and I'm a Machine Learning Engineer at Hugging Face.
|
áááşášáááŹááŤá ááťá˝ááşááąáŹáˇáşááŹáááş Matthew ááźá
áşááźáŽá¸ Hugging Face áážáŹ á
ááşáááşáá°áážáŻ áĄááşááťááşááŽááŹáá
áşáŚá¸ááŤá
|
17
|
I work on the open-source team and I'm responsible for maintaining particularly the TensorFlow code there.
|
ááťá˝ááşááąáŹáşáᏠopen-source áĄáá˝á˛áˇáážáŹ ááŻááşáááŻááşááźáŽá¸ áĄáá°á¸áááźááˇáş TensorFlow code áá˝áąááᯠááááşá¸ááááşá¸á
áąáŹááˇáşáážáąáŹááşáááŻáˇ ááŹáááşáá°ááŹá¸ááŤáááşá
|
18
|
Previously, I was a Machine Learning Engineer at Parsley, who've recently been acquired by Automatic, and I was a postdoctoral researcher before that at Trinity College, Dublin in Ireland working on computational genetics and retinal disease.
|
áĄáááşááŻááşá¸áááąáŹáˇ Automatic á áááźáŹááąá¸áááşáááž áááşáá°áá˛áˇáá˛áˇ Parsley áážáŹ Machine Learning Engineer áĄááźá
áş ááŻááşáá˛áˇáá°á¸ááŤáááşá áĄá˛ááŽáˇááááŻááşáááşá áĄááŻááşááŹáááşáááŻááşááśá Dublin áážá Trinity College áážáŹ ááťááşááźááşáá˝ážáŹááąáŹááŤáá˛áˇ áá˝ááşááťá°áᏠááťááŻá¸áááŻá¸ááŽááááŻááşáᏠááŻááąáááá˝áą ááŻááşáá˛áˇáá˛áˇ postdoctoral researcher áá
áşáŚá¸ ááźá
áşáá˛áˇááŤáááşá
|
19
|
Hi, I'm Lysandre.
|
áááşášáááŹááŤá ááťá˝ááşááąáŹáş Lysandre ááŤá
|
20
|
I'm a Machine Learning Engineer at Hugging Face and I'm specifically part of the open-source team.
|
ááťá˝ááşááąáŹáşá Hugging Face áážáŹ á
ááşáááşáá°áážáŻ áĄááşááťááşááŽááŹáá
áşáŚá¸ááźá
áşááźáŽá¸ open-source áĄáá˝á˛áˇáážáŹ áĄáá°á¸ááŤáááşáá°ááŤá
|
21
|
I've been at Hugging Face for a few years now and alongside my team members, I've been working on most of the tools that you'll get to see in this course.
|
ááťá˝ááşááąáŹáş Hugging Face áážáŹ áĄááŻáááŻáááş áážá
áşáĄáááşá¸áááşááźáŹááźáŽááźá
áşááźáŽá¸ ááťá˝ááşááąáŹáˇáşáá˛áˇ áĄáá˝á˛áˇááŹá¸áá˝áąáá˛áˇáĄáá° ááŽáááşáááşá¸áážáŹ áááşáááŻáˇáá˝áąáˇááźááşááááˇáş ááááááŹáĄááťáŹá¸á
áŻááᯠááŽáá˝ááşáááşááŽá¸áá˛áˇááŤáááşá
|
22
|
Hi, I'm Sylvain.
|
áááşášáááŹááŤá ááťá˝ááşááąáŹáş Sylvain ááŤá
|
23
|
I'm a Research Engineer at Hugging Face and one of the main maintainers of the Transformers Library.
|
ááťá˝ááşááąáŹáşá Hugging Face áážáŹ ááŻááąáá áĄááşááťááşááŽááŹáá
áşáŚá¸ááźá
áşááźáŽá¸ Transformers Library áá˛áˇ áĄááá ááááşá¸ááááşá¸áá°áá˝áąáá˛á áá
áşááąáŹááşááŤá
|
24
|
Previously, I worked at fast.ai where I helped develop the fast.ai Library as well as the online book.
|
áĄáááşááŻááşá¸á fast.ai áážáŹ áĄááŻááşááŻááşáá˛áˇááźáŽá¸ fast.ai Library áá˛áˇ online á
áŹáĄáŻááşááᯠááŽáá˝ááşááŹáážáŹ áá°ááŽáá˛áˇááŤáááşá
|
25
|
Before that, I was a math and computer science teacher in France.
|
áĄá˛ááŽáˇááááŻááşáááşá ááźááşáá
áşáážáŹ áááşášááťáŹáá˛áˇ áá˝ááşááťá°ááŹáááášááśáááŹáá
áşáŚá¸ ááźá
áşáá˛áˇááŤáááşá
|
26
|
Hi, my name is Sasha and I'm a Researcher at Hugging Face, working on the ethical, environmental and social impacts of machine learning models.
|
áááşášáááŹááŤá ááťá˝ááşááąáŹáˇáşááŹáááş Sasha ááźá
áşááźáŽá¸ Hugging Face áážáŹ ááŻááąááŽáá
áşáŚá¸ááŤá á
ááşáááşáá°áážáŻ ááąáŹáşáááşáá˝áąáá˛áˇ ááťááˇáşáááşá áááŹááááşáááşá¸ááťááşáá˛áˇ áá°áážáŻááąá¸áááŻááşáᏠáááşááąáŹááşáážáŻáá˝áąáĄááąáŤáş ááŻááşááąáŹááşááąááŤáááşá
|
27
|
Previously, I was a postdoctoral researcher at Mila, University in Montreal and I also worked as an Applied AI Researcher for the United Nations Global Pulse.
|
áĄáááşááŻááşá¸á Montreal áážá Mila ááášááááŻááşáážáŹ postdoctoral researcher áĄááźá
áşááŻááşáá˛áˇááźáŽá¸ ááŻááááášá Global Pulse áĄáá˝ááş Applied AI Researcher áĄááźá
áşáááşá¸ ááŻááşáá˛áˇáá°á¸ááŤáááşá
|
28
|
I've been involved in projects such as CodeCarbon and the Machine Learning Impacts Calculator to measure the carbon footprint of machine learning.
|
á
ááşáááşáá°áážáŻáá˛áˇ ááŹáá˝ááşááźáąááŹááᯠáááŻááşá¸ááŹáááŻáˇ CodeCarbon áá˛áˇ Machine Learning Impacts Calculator áááŻááťááŻá¸ áááąáŹááťááşáá˝áąáážáŹ ááŤáááşáá˛áˇááŤáááşá
|
29
|
Hi, I'm Merve and I'm a Developer Advocate at Hugging Face.
|
áááşášáááŹááŤá ááťá˝ááşááąáŹáş Merve ááŤá Hugging Face áážáŹ Developer Advocate áá
áşáŚá¸ááŤá
|
30
|
Previously, I was working as a Machine Learning Engineer building NLP tools and chatbots.
|
áĄáááşá NLP ááááááŹáá˝áąáá˛áˇ chatbots áá˝áąáááşááąáŹááşáá˛áˇ á
ááşáááşáá°áážáŻ áĄááşááťááşááŽááŹáĄááźá
áş ááŻááşáá˛áˇááŤáááşá
|
31
|
Currently, I'm working to improve the hub and democratize machine learning.
|
áááşáážááážáŹááąáŹáˇ hub ááᯠáááŻá¸áááşáĄáąáŹááşááŻááşáááŻáˇáá˛áˇ á
ááşáááşáá°áážáŻááᯠáá°áááŻááşá¸ááŻáśá¸áááŻááşáĄáąáŹááş ááŻááşááąáŹááşáááŻáˇ ááźááŻá¸á
áŹá¸ááąááŤáááşá
|
32
|
Hello everyone.
|
áĄáŹá¸ááŻáśá¸áááşášáááŹááŤá
|
33
|
My name is Lucile and I'm a Machine Learning Engineer at Hugging Face.
|
ááťá˝ááşáááŹáááş Lucile ááźá
áşááźáŽá¸ Hugging Face áážáŹ á
ááşáááşáá°áážáŻ áĄááşááťááşááŽááŹááŤá
|
34
|
To tell you in two sentences who I am, I work on the development and support of open-source tools and I also participate in several research project in the field of Natural Language Processing.
|
ááťá˝ááşá áááşáá°áá˛áááŻááŹááᯠááŤááťáážá
áşááźáąáŹááşá¸áá˛áˇ ááźáąáŹááááş open-source ááááááŹáá˝áąááᯠááŽáá˝ááşááŹáá˛áˇ áĄááąáŹááşáĄááśáˇááąá¸ááŹáá˝áą ááŻááşááąáŹááşááźáŽá¸ áááŹáááŹááŹá
ááŹá¸ á
áŽááśááąáŹááşáá˝ááşááźááşá¸ (Natural Language Processing) áááşáááşá ááŻááąáááááąáŹááťááşááťáŹá¸á
á˝áŹáážáŹáááşá¸ ááŤáááşááŤáááşá
|
35
|
Good day there.
|
áááşášáááŹáážáááąáŹááąáˇááŤá
|
36
|
I'm Lewis and I'm a Machine Learning Engineer in the open-source team at Hugging Face.
|
ááťá˝ááşááąáŹáş Lewis ááŤá Hugging Face áá˛áˇ open-source áĄáá˝á˛áˇáážáŹ á
ááşáááşáá°áážáŻ áĄááşááťááşááŽááŹáá
áşáŚá¸ááŤá
|
37
|
I'm passionate about developing tools for the NLP community and you'll see me at many of Hugging Face's outreach activities.
|
NLP áĄáááŻááşá¸áĄáááŻááşá¸áĄáá˝ááş ááááááŹáá˝áą ááŽáá˝ááşáááŹááᯠááŤáááŹááŤááźáŽá¸ Hugging Face áá˛áˇ áá°ááŻáááşáá˝ááşááąá¸ áážáŻááşáážáŹá¸áážáŻááťáŹá¸á
á˝áŹáážáŹ ááťá˝ááşááąáŹáˇáşááᯠáá˝áąáˇáááŤááááˇáşáááşá
|
38
|
Before joining Hugging Face, I spent several years developing machine learning applications for startups and enterprises in the domains of NLP, topological data analysis and time series.
|
Hugging Face áážáŹ ááááşáááşá NLPá topological data analysis áá˛áˇ time series áááŻáˇááᯠáááşáááşáá˝áąáážáŹ start-up áá˝áąáá˛áˇ ááŻááşáááşá¸ááźáŽá¸áá˝áąáĄáá˝ááş á
ááşáááşáá°áážáŻ applications áá˝áąááᯠáážá
áşááąáŤááşá¸ááťáŹá¸á
á˝áŹ ááŽáá˝ááşáá˛áˇáá°á¸ááŤáááşá
|
39
|
In a former life, I was a theoretical physicist, where I researched particle collisions at the Large Hadron Collider and so.
|
áĄáááşááááŻááşá¸á ááťá˝ááşááąáŹáşá ááŽáĄááŻááŽáááŻááşáᏠáá°áááąááááŹáážááşááźá
áşááźáŽá¸ Large Hadron Collider áážáŹ áĄáážáŻááşáááŻááşáážáŻáá˝áąááᯠááŻááąááááŻááşáá˛áˇááŤáááşá
|
40
|
Hey, I'm Leandro and I'm a Machine Learning Engineer in the open-source team at Hugging Face.
|
ááąá¸á ááťá˝ááşááąáŹáş Leandro ááŤá Hugging Face áá˛áˇ open-source áĄáá˝á˛áˇáážáŹ á
ááşáááşáá°áážáŻ áĄááşááťááşááŽááŹáá
áşáŚá¸ááŤá
|
41
|
Before joining Hugging Face, I worked as a Data Scientist in Switzerland and have taught Data Science at University.
|
Hugging Face ááᯠááááşáááşá áá˝á
áşááŹáááşáážáŹ Data Scientist áĄááźá
áş ááŻááşáááŻááşáá˛áˇááźáŽá¸ ááášááááŻááşáážáŹ ááąááŹáááášááś ááᯠáááşááźáŹá¸áá˛áˇáá°á¸ááŤáááşá
|
42
|
The pipeline function.
|
pipeline ááŻááşááąáŹááşááťááşá
|
43
|
The pipeline function is the most high level API of the Transformers library.
|
pipeline ááŻááşááąáŹááşááťááşáááş Transformers library á áĄááźááˇáşááŻáśá¸ áĄáááˇáşáážá API áá
áşáᯠááźá
áşáááşá
|
44
|
It regroups together all the steps to go from raw texts to usable predictions.
|
áááşá¸áááş áá°áááşá¸á
áŹááŹá¸ááťáŹá¸ááž áĄááŻáśá¸ááźáŻáááŻááşááąáŹ áááˇáşáážááşá¸ááťááşááťáŹá¸áĄáá ááąáŹááşáážááááş áááŻáĄááşáááˇáş áĄáááˇáşáĄáŹá¸ááŻáśá¸ááᯠá
áŻá
ááşá¸ááąá¸ááŹá¸áááşá
|
45
|
The model used is at the core of a pipeline, but the pipeline also include all the necessary pre-processing, since the model does not expect texts, but number, as well as some post-processing, to make the output of the model human-readable.
|
áĄááŻáśá¸ááźáŻááŹá¸áááˇáş ááąáŹáşáááşáááş pipelineá áĄáááááźá
áşááąáŹáşáááşá¸á ááąáŹáşáááşáááş á
áŹááŹá¸ááťáŹá¸áĄá
áŹá¸ ááśááŤááşááťáŹá¸áááŻáᏠáááşááśááąáŹááźáąáŹááˇáş áááŻáĄááşáááˇáş ááźááŻáááşá
áŽááśááąáŹááşáá˝ááşáážáŻááťáŹá¸áĄáŹá¸ááŻáśá¸áážááˇáş ááąáŹáşáááşá áĄáá˝ááşááᯠáá°ááŹá¸ááťáŹá¸ áááşáážáŻááŹá¸áááşáááŻááşá
áąáááş ááąáŹááşáááşáá˝á˛ á
áŽááśááąáŹááşáá˝ááşáážáŻáĄááťááŻáˇáááŻáááşá¸ pipelineáá˝ááş ááŤáááşá
áąáááşá
|
46
|
Let's look at a first example with the sentiment analysis pipeline.
|
ááśá
áŹá¸ááťááş áá˝á˛ááźááşá¸á
áááşááźáŹááźááşá¸ pipelineááźááˇáş áááááŻáśá¸ áĽáááŹáá
áşááŻááᯠááźááˇáşááźááĄáąáŹááşá
|
47
|
This pipeline performs text classification on a given input and determines if it's positive or negative.
|
á¤pipelineáááş ááąá¸ááŹá¸ááąáŹ áááˇáşáá˝ááşá¸á
áŹááŹá¸ááąáŤáşáá˝ááş á
áŹááŹá¸áá˝á˛ááźáŹá¸áááşáážááşááźááşá¸ááᯠááŻááşááąáŹááşááźáŽá¸ áĄááźáŻáááąáŹááąáŹááşááźááşá¸ áááŻáˇáááŻááş áĄááŻááşáááąáŹááąáŹááşááźááşá¸ áážáááážá ááŻáśá¸ááźááşáááşá
|
48
|
Here, it attributed the positive label on the given text, with a confidence of 95%.
|
á¤áá˝ááşá áááşá¸áááş ááąá¸ááŹá¸ááąáŹ á
áŹááŹá¸ááąáŤáşáážáŹ áá
% ááŻáśááźááşáážáŻáážááˇáşáĄáá° áĄááźáŻáááąáŹ label ááᯠáááşáážááşááąá¸áá˛áˇáááşá
|
49
|
You can pass multiple texts to the same pipeline, which will be processed and passed through the model together as a batch.
|
pipelineáá
áşááŻáááşá¸áááŻáˇ á
áŹááŹá¸ááťáŹá¸á
á˝áŹ ááąá¸áááŻáˇáááŻááşááźáŽá¸ áááşá¸áááŻáˇááᯠáĄá
áŻáááŻááş á
áŽááśááąáŹááşáá˝ááşááźáŽá¸ ááąáŹáşáááşáážáá
áşáááˇáş ááąá¸áááŻáˇááŤáááşá
|
50
|
The output is a list of individual results in the same order as the input texts.
|
áĄáá˝ááşááááşáááş áááˇáşáá˝ááşá¸á
áŹááŹá¸ááťáŹá¸áážááˇáş áá°ááŽááąáŹ áĄá
áŽáĄá
ááşáĄáááŻááşá¸ áá
áşáŚá¸ááťááşá¸ááááşááťáŹá¸á á
áŹáááşá¸ááźá
áşáááşá
|
51
|
Here we find the same label and score for the first text, and the second text is judged negative with a confidence of 99.9%.
|
á¤áá˝ááş áááá
áŹááŹá¸áĄáá˝ááş áá°ááŽááąáŹ label áážááˇáş score ááᯠáá˝áąáˇáááźáŽá¸ ááŻáááá
áŹááŹá¸ááᯠáá.á% ááŻáśááźááşáážáŻááźááˇáş áĄááŻááşáááąáŹááąáŹááşáááşáᯠááŻáśá¸ááźááşáá˛áˇáááşá
|
52
|
The zero-shot classification pipeline is a more general text-classification pipeline, it allows you to provide the labels you want.
|
zero-shot classification pipeline áááş áááŻáááŻááąááŻááťááťááąáŹ á
áŹááŹá¸áá˝á˛ááźáŹá¸áááşáážááşááźááşá¸ pipelineáá
áşááŻááźá
áşááźáŽá¸ áááşáááŻááťááşááąáŹ label ááťáŹá¸ááᯠááąá¸áááŻáˇáááŻááşáááş áá˝ááˇáşááźáŻáááşá
|
53
|
Here we want to classify our input text along the labels, education, politics, and business.
|
á¤áá˝ááş ááťá˝ááşáŻááşáááŻáˇáááş áááˇáşáá˝ááşá¸á
áŹááŹá¸ááᯠáááŹááąá¸á áááŻááşááśááąá¸áážááˇáş á
áŽá¸áá˝áŹá¸ááąá¸ á
áááˇáş label ááťáŹá¸áĄáááŻááşá¸ áá˝á˛ááźáŹá¸áááşáážááşáááŻáááşá
|
54
|
The pipeline successfully recognizes it's more about education than the other labels, with a confidence of 84%.
|
áá% ááŻáśááźááşáážáŻááźááˇáş áááşá¸áááş áĄááźáŹá¸ááąáŹ label ááťáŹá¸áááş áááŹááąá¸áážááˇáş áááŻáááŻáááşáááşááźáąáŹááşá¸ááᯠpipelineá áĄáąáŹááşááźááşá
á˝áŹ áááážááááŻááşáá˛áˇáááşá
|
55
|
Moving on to other tasks, the text generation pipeline will auto-complete a given prompt.
|
áĄááźáŹá¸ááŻááşáááşá¸ááťáŹá¸ááŽáááŻáˇ áááşáá˝áŹá¸ááŤá á
áŹááŹá¸ááŻááşááŻááşáážáŻ pipelineáááş ááąá¸ááŹá¸ááąáŹ á
áŹááŹá¸ááᯠáĄáááŻáĄááťáąáŹááş ááźááˇáşá
á˝ááşááąá¸ááŤááááˇáşáááşá
|
56
|
The output is generated with a bit of randomness, so it changes each time you call the generator object on a given prompt.
|
áĄáá˝ááşááááşáááş áĄáááşá¸áááş ááťáááşá¸ááźá
áşáážáŻááźááˇáş ááŻááşááŻááşááŹá¸ááąáŹááźáąáŹááˇáş ááąá¸ááŹá¸ááąáŹ á
áŹááŹá¸áá
áşááŻáĄáá˝ááş generator ááᯠááąáŤáşáááŻáááˇáş áĄááźáááşáááŻááşá¸ ááźáąáŹááşá¸áá˛áá˝áŹá¸áááşá
|
57
|
Up until now, we've used the the pipeline API with the default model associated to each task, but you can use it with any model that has been pretrained or fine-tuned on this task.
|
áááŻáĄááťáááşáĄáá ááťá˝ááşáŻááşáááŻáˇáááş ááŻááşáááşá¸áá
áşááŻá
áŽáážááˇáş áááşáááŻááşááąáŹ áá°áááąáŹáşáááşááźááˇáş pipeline API ááᯠáĄááŻáśá¸ááźáŻáá˛áˇááąáŹáşáááşá¸ á¤ááŻááşáááşá¸áĄáá˝ááş ááźááŻáááşááąáˇááťááˇáşááŹá¸ááąáŹ áááŻáˇáááŻááş fine-tune ááŻááşááŹá¸ááąáŹ áááşáááˇáşááąáŹáşáááşáážááˇáşáááᯠáááşá¸ááᯠáĄááŻáśá¸ááźáŻáááŻááşáááşá
|
58
|
Going on the model hub, huggingface.co/models you can filter the available models by task.
|
huggingface.co/models áážá model hub áááŻáˇ áááşááąáŹááşááźáŽá¸ ááážááááŻááşááąáŹ ááąáŹáşáááşááťáŹá¸ááᯠááŻááşáááşá¸áĄáááŻááş á
á
áşááŻááşáááŻááşáááşá
|
59
|
The default model used in our previous example was gpt2, but there are many more models available, and not just in English.
|
ááťá˝ááşáŻááşáááŻáˇá ááááşáĽáááŹáá˝ááş áĄááŻáśá¸ááźáŻáá˛áˇááąáŹ áá°áááąáŹáşáááşáážáŹ gpt2 ááźá
áşááąáŹáşáááşá¸ áĄááşášáááááşááŹááŹáĄááźááş áĄááźáŹá¸ááąáŹáşáááşááťáŹá¸á
á˝áŹáááŻáááşá¸ ááážááááŻááşáááşá
|
60
|
Let's go back to the text generation pipeline and load it with another model, distilgpt2.
|
á
áŹááŹá¸ááŻááşááŻááşáážáŻ pipelineáááŻáˇ ááźááşáá˝áŹá¸ááźáŽá¸ distilgpt2 áá°ááąáŹ áĄááźáŹá¸ááąáŹáşáááşááźááˇáş áááşááźááˇáşááĄáąáŹááşá
|
61
|
This is a lighter version of gpt2 created by the Hugging Face team.
|
áááşá¸áááş Hugging Face áĄáá˝á˛áˇááž áááşááŽá¸ááŹá¸ááąáŹ gpt2 á áááŻáááŻááąáŤáˇááŤá¸áááˇáş ááŹá¸áážááşá¸áá
áşáᯠááźá
áşáááşá
|
62
|
When applying the pipeline to a given prompt, we can specify several arguments such as the maximum length of the generated texts, or the number of sentences we want to return, since there is some randomness in the generation.
|
pipelineááᯠááąá¸ááŹá¸ááąáŹ á
áŹááŹá¸áá
áşááŻááąáŤáşáá˝ááş áĄááŻáśá¸ááťáááˇáşáĄá፠ááŻááşááŻááşáááˇáş á
áŹááŹá¸ááťáŹá¸á áĄááťáŹá¸ááŻáśá¸áĄáážááş áááŻáˇáááŻááş ááźááşáááŻááťááşááąáŹ á
áŹááźáąáŹááşá¸áĄááąáĄáá˝ááş á
áááˇáş áááˇáşáááşááťááşááťáŹá¸á
á˝áŹááᯠáááşáážááşáááŻááşáááşá áĄáááşááźáąáŹááˇáşáááŻááąáŹáş ááŻááşááŻááşáážáŻáá˝ááş ááťáááşá¸ááźá
áşáážáŻáĄááťááŻáˇ ááŤáááşááąáŹááźáąáŹááˇáş ááźá
áşáááşá
|
63
|
Generating texts by guessing the next word in a sentence was the pretraining objective of GPT-2.
|
ááŤááťáá
áşááźáąáŹááşá¸áážá ááąáŹááşááŹáááˇáş á
ááŹá¸ááŻáśá¸ááᯠáááˇáşáážááşá¸ááźááşá¸ááźááˇáş á
áŹááŹá¸ááťáŹá¸ ááŻááşááŻááşááźááşá¸áááş GPT-2 á ááźááŻáááşááąáˇááťááˇáşáážáŻ áááşáážááşá¸ááťááşááźá
áşáááşá
|
64
|
The fill mask pipeline is the pretraining objective of BERT, which is to guess the value of masked word.
|
fill mask pipeline áááş ááŻáśá¸áá˝ááşááŹá¸ááąáŹ á
ááŹá¸ááŻáśá¸á áááşáááŻá¸ááᯠáááˇáşáážááşá¸áááşááźá
áşáááˇáş BERT á ááźááŻáááşááąáˇááťááˇáşáážáŻ áááşáážááşá¸ááťááşááźá
áşáááşá
|
65
|
In this case, we ask the two most likely values for the missing words, according to the model, and get mathematical or computational as possible answers.
|
á¤ááá
ášá
áá˝ááşá ááťá˝ááşáŻááşáááŻáˇáááş ááťáąáŹááşááŻáśá¸ááąááąáŹ á
ááŹá¸ááŻáśá¸ááťáŹá¸áĄáá˝ááş ááąáŹáşáááşá áĄáááŻáĄá ááźá
áşáááŻááşááźáąáĄááťáŹá¸ááŻáśá¸ áááşáááŻá¸áážá
áşááŻááᯠááąáŹááşá¸áááŻáá˛áˇáᏠáĄááźáąááťáŹá¸áĄááźá
áş mathematical áááŻáˇáááŻááş computational áááŻáˇááᯠááážááá˛áˇáááşá
|
66
|
Another task Transformers model can perform is to classify each word in the sentence instead of the sentence as a whole.
|
Transformer ááąáŹáşáááşááťáŹá¸ ááŻááşááąáŹááşáááŻááşáááˇáş áĄááźáŹá¸ááŻááşáááşá¸áá
áşááŻáážáŹ ááŤááťáá
áşááŻááŻáśá¸áĄá
áŹá¸ ááŤááťáĄáá˝ááşá¸áážá á
ááŹá¸ááŻáśá¸áá
áşááŻáśá¸ááťááşá¸á
áŽááᯠáá˝á˛ááźáŹá¸áááşáážááşááźááşá¸ ááźá
áşáááşá
|
67
|
One example of this is Named Entity Recognition, which is the task of identifying entities, such as persons, organizations or locations in a sentence.
|
áááşá¸á áĽáááŹáá
áşááŻáážáŹ Named Entity Recognition ááźá
áşááźáŽá¸ ááŤááťáá
áşááŻáĄáá˝ááşá¸áážá áá°ááŻáášáááŻááşááťáŹá¸á áĄáá˝á˛áˇáĄá
ááşá¸ááťáŹá¸ áááŻáˇáááŻááş áááşááąááŹááťáŹá¸áá˛áˇáááŻáˇááąáŹ áĄáá˝á˛áˇáĄá
ááşá¸ááťáŹá¸ááᯠááąáŹáşááŻááşááźááşá¸ááŻááşáááşá¸ ááźá
áşáááşá
|
68
|
Here, the model correctly finds the person, Sylvain, the organization, Hugging Face, as well as the location, Brooklyn, inside the input text.
|
á¤áá˝ááşá ááąáŹáşáááşáááş áááˇáşáá˝ááşá¸á
áŹááŹá¸áĄáá˝ááşá¸áážá ááŻáášáááŻááş Sylvainá áĄáá˝á˛áˇáĄá
ááşá¸ Hugging Face áážááˇáş áááşááąáᏠBrooklyn áááŻáˇááᯠáážááşáááşá
á˝áŹ áážáŹáá˝áąáá˝áąáˇáážááá˛áˇáááşá
|
69
|
The grouped_entities=True argument used is to make the pipeline group together the different words linked to the same entity, such as Hugging and Face here.
|
áĄááŻáśá¸ááźáŻááŹá¸ááąáŹ grouped_entities=True argument áááş Hugging áážááˇáş Face áá˛áˇáááŻáˇ áá°ááŽááąáŹáĄáá˝á˛áˇáĄá
ááşá¸áážááˇáş ááťáááşáááşááŹá¸áááˇáş á
ááŹá¸ááŻáśá¸áĄááťááŻá¸ááťááŻá¸ááᯠpipelineáá
áşááŻáááşá¸áĄááźá
áş á
áŻá
ááşá¸ááąá¸áááş ááźá
áşáááşá
|
70
|
Another task available with the pipeline API is extractive question answering.
|
pipeline API ááźááˇáş ááážááááŻááşááąáŹ áĄááźáŹá¸ááŻááşáááşá¸áá
áşááŻáážáŹ ááąá¸áá˝ááşá¸ááŻááşáá°ááźáąááźáŹá¸ááźááşá¸ ááźá
áşáááşá
|
71
|
Providing a context and a question, the model will identify the span of text in the context containing the answer to the question.
|
áĄááźáąáŹááşá¸áĄááŹáážááˇáş ááąá¸áá˝ááşá¸áá
áşááŻááᯠááąá¸áááŻáˇáááŻááşááŤá ááąáŹáşáááşáááş ááąá¸áá˝ááşá¸á áĄááźáąááŤáážáááąáŹ áĄááźáąáŹááşá¸áĄááŹáĄáá˝ááşá¸áážá á
áŹááŹá¸áĄáááŻááşá¸áĄá
ááᯠááąáŹáşááŻááşááąá¸ááŤááááˇáşáááşá
|
72
|
Getting short summaries of very long articles is also something the Transformers library can help with, with the summarization pipeline.
|
áĄáá˝ááşáážááşááťáŹá¸ááąáŹ ááąáŹááşá¸ááŤá¸ááťáŹá¸ááᯠáĄáááŻááťáŻáśá¸ááťáŻááşáááŻá¸ááťáŻááşááźááşá¸áááşáááşá¸ summarization pipeline ááźááˇáş Transformers library á áá°ááŽááąá¸áááŻááşááąáŹ ááŻááşáááşá¸áá
áşááŻááźá
áşáááşá
|
73
|
Finally, the last task supported by the pipeline API is translation.
|
ááąáŹááşááŻáśá¸áĄááąáá˛áˇ pipeline API á ááśáˇáááŻá¸ááąá¸áá˛áˇ ááąáŹááşááŻáśá¸ááŻááşáááşá¸áááąáŹáˇ ááŹááŹááźááşááźááşá¸ ááźá
áşáááşá
|
74
|
Here we use a French/English model found on the model hub to get the English version of our input text.
|
á¤áá˝ááş ááťá˝ááşáŻááşáááŻáˇá áááˇáşáá˝ááşá¸á
áŹááŹá¸ááᯠáĄááşášáááááşááŹááŹáááŻáˇ ááźáąáŹááşá¸áá˛áááşáĄáá˝ááş model hub áá˝ááş áá˝áąáˇáážááááąáŹ ááźááşáá
áş/áĄááşášáááááş ááąáŹáşáááşááᯠáĄááŻáśá¸ááźáŻááŹá¸áááşá
|
75
|
Here is a brief summary of all the tasks we've looked into in this video.
|
á¤ááŽááŽáááŻáá˝ááş ááťá˝ááşáŻááşáááŻáˇ ááąáˇááŹáá˛áˇáááˇáş ááŻááşáááşá¸ááťáŹá¸áĄáŹá¸ááŻáśá¸ááᯠáĄááťááşá¸ááťáŻááş ááąáŹáşááźááŹá¸áááşá
|
76
|
Try then out through the inference widgets in the model hub.
|
áááŻááŻááşáááşá¸ááťáŹá¸ááᯠmodel hub áážá inference widgets ááťáŹá¸áážáá
áşáááˇáş á
ááşá¸áááşááźááˇáşááŤá
|
77
|
So let's talk about the carbon footprint of transformers.
|
Transformer ááťáŹá¸á ááŹáá˝ááşááźáąááŹáĄááźáąáŹááşá¸ ááźáąáŹááźááˇáşááĄáąáŹááşá
|
78
|
Maybe you've seen headlines such as this one that training a single AI model can emit as much carbon as five cars in their lifetimes.
|
AI ááąáŹáşáááşáá
áşááŻáááşá¸ááᯠááąáˇááťááˇáşááąá¸ááźááşá¸á ááŹá¸ááŤá¸á
áŽá¸á áááşáááşá¸áá
áşááťážáąáŹááş ááŻááşáá˝ážááşááąáŹ ááŹáá˝ááşáááŹááážááˇáş ááŽááťážáááŻááşáááşáá°ááąáŹ ááąáŤááşá¸á
áŽá¸ááťááŻá¸ áááşáá˝áąáˇáá°á¸ááąáááşá
|
79
|
So when is this true and is it always true?
|
ááŤááᯠááŤá áááşáĄááťáááşáážáŹ áážááşááá˛á áĄááźá˛áááşá¸ áážááşáááŹá¸á
|
80
|
Well, it actually depends on several things.
|
ááááşááąáŹáˇ ááŤá áĄááťááşááťáŹá¸á
á˝áŹááąáŤáş áá°áááşáááşá
|
81
|
Most importantly, it depends on the type of energy you're using.
|
áĄááąá¸áĄááźáŽá¸ááŻáśá¸áááąáŹáˇ áááşáĄááŻáśá¸ááźáŻááąáááˇáş á
á˝ááşá¸áĄááşáĄááťááŻá¸áĄá
áŹá¸ááąáŤáş áá°áááşáááşá
|
82
|
If you're using renewable energy such as solar, wind, hydroelectricity, you're really not emitting any carbon at all, very, very little.
|
ááąááąáŹááşááźááşá ááąáĄáŹá¸á ááąáĄáŹá¸ááťážááşá
á
áş á
áááˇáş ááźááşáááşááźááˇáşááźááŻá¸ááźá˛á
á˝ááşá¸áĄááşááᯠáĄááŻáśá¸ááźáŻááąááŤá ááŹáá˝ááş ááŻáśá¸áááŽá¸ááŤá¸ áááŻáˇáááŻááş áĄáááşá¸áááşáᏠááŻááşáá˝ážááşáááşá
|
83
|
If you're using non-renewable energy sources such as coal then their carbon footprint is a lot higher 'cuz essentially you are emitting a lot of greenhouse gases.
|
ááťáąáŹááşááŽá¸áá˝áąá¸áá˛áˇáááŻáˇ ááźááşáááşáááźááˇáşááźááŻá¸áááŻááşááąáŹ á
á˝ááşá¸áĄááşáĄáááşá¸áĄááźá
áşááťáŹá¸ááᯠáĄááŻáśá¸ááźáŻááŤá áááşááŻáśáĄáááşááŹááşáá˝áąáˇááťáŹá¸á
á˝áŹááᯠááŻááşáá˝ážááşááąááąáŹááźáąáŹááˇáş áááşá¸áááŻáˇá ááŹáá˝ááşááźáąááŹáááş ááťáŹá¸á
á˝áŹ áááŻááźááˇáşáááşá
|
84
|
Another aspect is training time.
|
ááąáŹááşáá
áşááťááşáááąáŹáˇ ááąáˇááťááˇáşááťáááş ááźá
áşáááşá
|
85
|
So the longer you train, the more energy you use the more energy you use, the more carbon you emit, right?
|
ááŤááźáąáŹááˇáş áááşááźáŹááźáŹ ááąáˇááťááˇáşááąááąá á
á˝ááşá¸áĄááşáááŻááŻáśá¸ááąááąá ááŹáá˝ááşáááŻááŻááşáá˝ážááşááąááą ááźá
áşáááşá
|
86
|
So this really adds up especially if you're training large models for for hours and days and weeks.
|
ááŤááźáąáŹááˇáş ááŤáá˝áąá áĄáá°á¸áááźááˇáş ááźáŽá¸ááŹá¸áá˛áˇ ááąáŹáşáááşáá˝áąááᯠááŹááŽááťáŹá¸á
á˝áŹá áááşááąáŤááşá¸ááťáŹá¸á
á˝áŹáá˛áˇ áááşááášááááşáá˝áąáĄáá ááąáˇááťááˇáşáá˛áˇáĄá፠ááááşááᯠá
áŻááŻáśááŹáááŻááşáááşá
|
87
|
The hardware you use also matters because some GPUs, for example, are more efficient than others and utilizing efficiency use properly.
|
áááşáĄááŻáśá¸ááźáŻáááˇáş ááŹáˇááşáá˛áááşá¸ áĄááąá¸ááźáŽá¸áááşá áĄáááşááźáąáŹááˇáşáááŻááąáŹáş áĽáááŹáĄáŹá¸ááźááˇáş GPU áĄááťááŻáˇáᏠáááźáŹá¸ GPU áá˝áąáááş áááŻáááŻá
á˝ááşá¸ááąáŹááşáááşááźááˇáşááŹá¸ááźáŽá¸ á
á˝ááşá¸ááąáŹááşáááşááᯠááąáŹááşá¸á
á˝áŹáĄááŻáśá¸ááťáááŻááşááąáŹááźáąáŹááˇáş ááźá
áşáááşá
|
88
|
So using them a hundred percent all the time can really reduce the energy consumption that you have.
|
ááŤááźáąáŹááˇáş áááşá¸áááŻáˇááᯠáĄááťáááşááźááˇáş ááá ááŹáááŻááşáážáŻááşá¸ áĄááŻáśá¸ááźáŻááźááşá¸á áááˇáşáá˛áˇ á
á˝ááşá¸áĄááşááŻáśá¸á
á˝á˛áážáŻááᯠááááşáᲠááťážáąáŹáˇááťáááŻááşáááşá
|
89
|
And then once again, reduce your carbon footprint.
|
áĄá˛ááŽáĄááŤáážáŹ áááˇáşáá˛áˇ ááŹáá˝ááşááźáąááŹááᯠáááşááśááťážáąáŹáˇááťáááŻááşáážáŹ ááźá
áşáááşá
|
90
|
There's also other aspects such as IO such as data, et cetera, et cetera.
|
IO áážááˇáş ááąáᏠá
áááˇáş áĄááźáŹá¸ááášáááťáŹá¸áááşá¸ áážáááŤááąá¸áááşá
|
91
|
But these are the main three that you should focus on.
|
áááŻáˇááąáŹáş á¤ááŻáśá¸ááťááşáááąáŹáˇ áááşáĄááá áĄáŹááŻáśá
ááŻááşáááˇáşáááˇáş áĄááťááşááťáŹá¸ ááźá
áşáááşá
|
92
|
So when I talk about energy sources and carbon intensity what does that really mean?
|
ááŤááᯠááťá˝ááşááąáŹáş á
á˝ááşá¸áĄááşáĄáááşá¸áĄááźá
áşáá˝áąáá˛áˇ ááŹáá˝ááşááźááşá¸áĄáŹá¸áĄááźáąáŹááşá¸ ááźáąáŹáá˛áˇáĄá፠áĄá˛ááŤá ááááşááąáŹáˇ ááŹáááŻáááŻáááŻááá˛á
|
93
|
So if you look at the top of the screen you have a carbon footprint of a cloud computing instance in Mumbai, India which emits 920 grams of CO2 per kilowatt hour.
|
ááťááşáážáŹááźááşáá˛áˇ ááááşáážáŹ ááźááˇáşáááşáááŻáááş áĄááášááááááŻááşááś áá˝ááşáááŻááşá¸áážá cloud computing instance áá
áşááŻáá˛áˇ ááŹáá˝ááşááźáąááŹááᯠáá˝áąáˇááážáŹááŤá áááşá¸áááş áá
áşááŹáᎠááŽáááŻáááşááťážááş CO2 ááá ááááş ááŻááşáá˝ážááşáááşá
|
94
|
This is almost one kilogram of CO2 per kilowatt hour of electricity used.
|
ááŤá áĄááŻáśá¸ááźáŻáá˛áˇ ááťážááşá
á
áşááŹááşáĄáŹá¸ áá
áşááŹáᎠááŽáááŻáááşááťážááş CO2 áá
áşááŽáááŻááááş ááŽá¸ááŤá¸áážááááşá
|
95
|
If you compare that with Canada, Montreal where I am right now, 20 grams of CO2 per kilo hour.
|
ááŤááᯠááťá˝ááşááąáŹáş ááŻááąáŹááşááąáá˛áˇ áááąááŤáááŻááşááśá Montreal áá˛áˇ áážááŻááşá¸áážááşááźááˇáşáááş áá
áşááŹáᎠááŽáááŻáááşááťážááş CO2 áá ááááşáᲠáážááááşá
|
96
|
So that's a really, really big difference.
|
ááŤáᏠááááşááᯠááźáŽá¸ááŹá¸áá˛áˇ áá˝áŹááźáŹá¸ááťááşáá
áşáᯠááźá
áşáááşá
|
97
|
Almost more than 40 times more carbon emitted in Mumbai versus Montreal.
|
Montreal áá˛áˇáážááşáááş áá˝ááşáááŻááşá¸áážáŹ ááŹáá˝ááşááŻááşáá˝ážááşáážáŻá áĄá áá ááťáąáŹáş áááŻááťáŹá¸áááşá
|
98
|
And so this can really, really add up.
|
ááŤááźáąáŹááˇáş ááŤáá˝áąá ááááşáᲠá
áŻááŻáśááŹáááŻááşáááşá
|
99
|
If you're training a model for weeks, for example you're multiplying times 40 the carbon that you're emitting.
|
áĽáááŹáĄáŹá¸ááźááˇáş áááşá ááąáŹáşáááşáá
áşááŻááᯠáááşááášááááşáá˝áąáĄáá ááąáˇááťááˇáşááąáááşáááŻáááş áááşááŻááşáá˝ážááşáá˛áˇ ááŹáá˝ááşááᯠáĄá áá áá˛áˇ ááźážáąáŹááşááąááŹáážááˇáş áá°áááşá
|
100
|
So choosing the right instance choosing a low carbon compute instance is really the most impactful thing that you can do.
|
ááŤááźáąáŹááˇáş áááˇáşááąáŹáşáá˛áˇ instance ááᯠáá˝áąá¸ááťááşááŹá ááŹáá˝ááşáááşá¸áá˛áˇ compute instance ááᯠáá˝áąá¸ááťááşááŹáᏠáááşááŻááşááąáŹááşáááŻááşáá˛áˇ áĄááťááŻá¸áááşááąáŹááşáážáŻ áĄáážáááŻáśá¸áĄáᏠááźá
áşáááşá
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.