Merge pull request #665 from jasleen101010/main
Fixed hyperlinks and knitted the Rmd files to html for beginning half of the repo
Commit 1cad50dca5
@@ -355,3 +355,9 @@ MigrationBackup/
+# Mac-specific
+.DS_Store
+.Rproj.user
+.Rdata
+.Rhistory
+ML-For-Beginners.Rproj
@@ -6,7 +6,7 @@
 ## [Pre-lecture quiz](https://gray-sand-07a10f403.1.azurestaticapps.net/quiz/9/)

-> ### [This lesson is available in R!](./solution/R/lesson_1-R.ipynb)
+> ### [This lesson is available in R!](./solution/R/lesson_1.html)

 ## Introduction
File diff suppressed because one or more lines are too long
@@ -6,7 +6,7 @@ Infographic by [Dasani Madipalli](https://twitter.com/dasani_decoded)
 ## [Pre-lecture quiz](https://gray-sand-07a10f403.1.azurestaticapps.net/quiz/11/)

-> ### [This lesson is available in R!](./solution/R/lesson_2-R.ipynb)
+> ### [This lesson is available in R!](./solution/R/lesson_2.html)

 ## Introduction
File diff suppressed because one or more lines are too long
@@ -4,7 +4,7 @@
 > Infographic by [Dasani Madipalli](https://twitter.com/dasani_decoded)
 ## [Pre-lecture quiz](https://gray-sand-07a10f403.1.azurestaticapps.net/quiz/13/)

-> ### [This lesson is available in R!](./solution/R/lesson_3-R.ipynb)
+> ### [This lesson is available in R!](./solution/R/lesson_3.html)
 ### Introduction

 So far you have explored what regression is with sample data gathered from the pumpkin pricing dataset that we will use throughout this lesson. You have also visualized it using Matplotlib.
@@ -84,8 +84,8 @@ We do so since we want to model a line that has the least cumulative distance fr
 >
 > In other words, and referring to our pumpkin data's original question: "predict the price of a pumpkin per bushel by month", `X` would refer to the price and `Y` would refer to the month of sale.
 >
-> ![Infographic by Jen Looper](../../images/calculation.png)
 >
+![Infographic by Jen Looper](../../images/calculation.png)
 > Calculate the value of Y. If you're paying around \$4, it must be April!
 >
 > The math that calculates the line must demonstrate the slope of the line, which is also dependent on the intercept, or where `Y` is situated when `X = 0`.
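A quick worked illustration of that `Y = intercept + slope * X` relationship, using made-up slope and intercept values (the lesson estimates the real coefficients from the pumpkin data):

```r
# Hypothetical coefficients, for illustration only -- not fitted from the lesson's data
intercept <- 0    # value of Y when X = 0
slope     <- 1    # change in Y per unit change in X
X <- 4            # price per bushel in dollars
Y <- intercept + slope * X
Y                 # predicted month of sale: 4, i.e. April
```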
@@ -114,7 +114,7 @@ Load up required libraries and dataset. Convert the data to a data frame contain

 - Convert the price to reflect the pricing by bushel quantity

-> We covered these steps in the [previous lesson](https://github.com/microsoft/ML-For-Beginners/blob/main/2-Regression/2-Data/solution/lesson_2-R.ipynb).
+> We covered these steps in the [previous lesson](https://github.com/microsoft/ML-For-Beginners/blob/main/2-Regression/2-Data/solution/lesson_2.html).

 ```{r load_tidy_verse_models, message=F, warning=F}
 # Load the core Tidyverse packages
@@ -285,7 +285,7 @@ That's an awesome thought! You see, once your recipe is defined, you can estimat

 For that, you'll need two more verbs: `prep()` and `bake()` and as always, our little R friends by [`Allison Horst`](https://github.com/allisonhorst/stats-illustrations) help you in understanding this better!

-![Artwork by \@allison_horst](../images/recipes.png){width="550"}
+![Artwork by \@allison_horst](../../images/recipes.png){width="550"}

 [`prep()`](https://recipes.tidymodels.org/reference/prep.html): estimates the required parameters from a training set that can be later applied to other data sets. For instance, for a given predictor column, what observation will be assigned integer 0 or 1 or 2 etc
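For context on that `prep()`/`bake()` pair, here is a minimal sketch using the tidymodels `recipes` package; the toy data frame and its column names are invented for illustration and are not the lesson's pumpkin data:

```r
library(tidymodels)  # attaches recipes, dplyr and friends

# Toy data, invented for illustration
toy_df <- data.frame(
  variety = c("PIE TYPE", "FAIRYTALE", "PIE TYPE"),  # categorical predictor
  price   = c(3.5, 4.2, 3.8)                         # numeric outcome
)

# Declare the preprocessing: encode the nominal predictor as zero-based integers
toy_recipe <- recipe(price ~ variety, data = toy_df) %>%
  step_integer(all_nominal_predictors(), zero_based = TRUE)

prepped <- prep(toy_recipe, training = toy_df)  # prep(): estimate the encoding from the training set
bake(prepped, new_data = NULL)                  # bake(): apply it; new_data = NULL returns the processed training set
```

The same `prepped` object can later be baked on new data, which is why estimation and application are kept as separate steps.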
File diff suppressed because one or more lines are too long
@@ -4,7 +4,7 @@

 ## [Pre-lecture quiz](https://gray-sand-07a10f403.1.azurestaticapps.net/quiz/15/)

-> ### [This lesson is available in R!](./solution/R/lesson_4-R.ipynb)
+> ### [This lesson is available in R!](./solution/R/lesson_4.html)

 ## Introduction
@@ -14,7 +14,7 @@ output:

 ![Infographic by Dasani Madipalli](../../images/logistic-linear.png){width="600"}

-#### ** [Pre-lecture quiz](https://gray-sand-07a10f403.1.azurestaticapps.net/quiz/15/)**
+#### **[Pre-lecture quiz](https://gray-sand-07a10f403.1.azurestaticapps.net/quiz/15/)**

 #### Introduction
@@ -56,7 +56,7 @@ suppressWarnings(if (!require("pacman"))install.packages("pacman"))
 pacman::p_load(tidyverse, tidymodels, janitor, ggbeeswarm)
 ```

-## ** Define the question**
+## **Define the question**

 For our purposes, we will express this as a binary: 'Orange' or 'Not Orange'. There is also a 'striped' category in our dataset but there are few instances of it, so we will not use it. It disappears once we remove null values from the dataset, anyway.
@@ -148,7 +148,7 @@ pumpkins_select %>%

 The goal of data exploration is to try to understand the `relationships` between its attributes; in particular, any apparent correlation between the *features* and the *label* your model will try to predict. One way of doing this is by using data visualization.

-Given our the data types of our columns, we can `encode` them and be on our way to making some visualizations. This simply involves `translating` a column with `categorical values` for example our columns of type *char*, into one or more `numeric columns` that take the place of the original. - Something we did in our [last lesson](https://github.com/microsoft/ML-For-Beginners/blob/main/2-Regression/3-Linear/solution/lesson_3-R.ipynb).
+Given our the data types of our columns, we can `encode` them and be on our way to making some visualizations. This simply involves `translating` a column with `categorical values` for example our columns of type *char*, into one or more `numeric columns` that take the place of the original. - Something we did in our [last lesson](https://github.com/microsoft/ML-For-Beginners/blob/main/2-Regression/3-Linear/solution/lesson_3.html).

 Tidymodels provides yet another neat package: [recipes](https://recipes.tidymodels.org/)- a package for preprocessing data. We'll define a `recipe` that specifies that all predictor columns should be encoded into a set of integers , `prep` it to estimates the required quantities and statistics needed by any operations and finally `bake` to apply the computations to new data.
File diff suppressed because one or more lines are too long
@@ -21,7 +21,7 @@ Classification uses various algorithms to determine other ways of determining a

 ## [Pre-lecture quiz](https://gray-sand-07a10f403.1.azurestaticapps.net/quiz/19/)

-> ### [This lesson is available in R!](./solution/R/lesson_10-R.ipynb)
+> ### [This lesson is available in R!](./solution/R/lesson_10.html)

 ### Introduction
@@ -393,7 +393,7 @@ Let's now save a copy of this data for use in future lessons:

 ```{r save_preproc_data}
 # Save preprocessed data
-write_csv(preprocessed_df, "../../data/cleaned_cuisines_R.csv")
+write_csv(preprocessed_df, "../../../data/cleaned_cuisines_R.csv")

 ```
File diff suppressed because one or more lines are too long
File diff not shown because it is too large
README.md
@@ -94,12 +94,12 @@ By ensuring that the content aligns with projects, the process is made more enga
 | 02 | The History of machine learning | [Introduction](1-Introduction/README.md) | Learn the history underlying this field | [Lesson](1-Introduction/2-history-of-ML/README.md) | Jen and Amy |
 | 03 | Fairness and machine learning | [Introduction](1-Introduction/README.md) | What are the important philosophical issues around fairness that students should consider when building and applying ML models? | [Lesson](1-Introduction/3-fairness/README.md) | Tomomi |
 | 04 | Techniques for machine learning | [Introduction](1-Introduction/README.md) | What techniques do ML researchers use to build ML models? | [Lesson](1-Introduction/4-techniques-of-ML/README.md) | Chris and Jen |
-| 05 | Introduction to regression | [Regression](2-Regression/README.md) | Get started with Python and Scikit-learn for regression models | <ul><li>[Python](2-Regression/1-Tools/README.md)</li><li>[R](2-Regression/1-Tools/solution/R/lesson_1-R.ipynb)</li></ul> | <ul><li>Jen</li><li>Eric Wanjau</li></ul> |
-| 06 | North American pumpkin prices 🎃 | [Regression](2-Regression/README.md) | Visualize and clean data in preparation for ML | <ul><li>[Python](2-Regression/2-Data/README.md)</li><li>[R](2-Regression/2-Data/solution/R/lesson_2-R.ipynb)</li></ul> | <ul><li>Jen</li><li>Eric Wanjau</li></ul> |
-| 07 | North American pumpkin prices 🎃 | [Regression](2-Regression/README.md) | Build linear and polynomial regression models | <ul><li>[Python](2-Regression/3-Linear/README.md)</li><li>[R](2-Regression/3-Linear/solution/R/lesson_3-R.ipynb)</li></ul> | <ul><li>Jen and Dmitry</li><li>Eric Wanjau</li></ul> |
-| 08 | North American pumpkin prices 🎃 | [Regression](2-Regression/README.md) | Build a logistic regression model | <ul><li>[Python](2-Regression/4-Logistic/README.md) </li><li>[R](2-Regression/4-Logistic/solution/R/lesson_4-R.ipynb)</li></ul> | <ul><li>Jen</li><li>Eric Wanjau</li></ul> |
+| 05 | Introduction to regression | [Regression](2-Regression/README.md) | Get started with Python and Scikit-learn for regression models | <ul><li>[Python](2-Regression/1-Tools/README.md)</li><li>[R](2-Regression/1-Tools/solution/R/lesson_1.html)</li></ul> | <ul><li>Jen</li><li>Eric Wanjau</li></ul> |
+| 06 | North American pumpkin prices 🎃 | [Regression](2-Regression/README.md) | Visualize and clean data in preparation for ML | <ul><li>[Python](2-Regression/2-Data/README.md)</li><li>[R](2-Regression/2-Data/solution/R/lesson_2.html)</li></ul> | <ul><li>Jen</li><li>Eric Wanjau</li></ul> |
+| 07 | North American pumpkin prices 🎃 | [Regression](2-Regression/README.md) | Build linear and polynomial regression models | <ul><li>[Python](2-Regression/3-Linear/README.md)</li><li>[R](2-Regression/3-Linear/solution/R/lesson_3.html)</li></ul> | <ul><li>Jen and Dmitry</li><li>Eric Wanjau</li></ul> |
+| 08 | North American pumpkin prices 🎃 | [Regression](2-Regression/README.md) | Build a logistic regression model | <ul><li>[Python](2-Regression/4-Logistic/README.md) </li><li>[R](2-Regression/4-Logistic/solution/R/lesson_4.html)</li></ul> | <ul><li>Jen</li><li>Eric Wanjau</li></ul> |
 | 09 | A Web App 🔌 | [Web App](3-Web-App/README.md) | Build a web app to use your trained model | [Python](3-Web-App/1-Web-App/README.md) | Jen |
-| 10 | Introduction to classification | [Classification](4-Classification/README.md) | Clean, prep, and visualize your data; introduction to classification | <ul><li> [Python](4-Classification/1-Introduction/README.md) </li><li>[R](4-Classification/1-Introduction/solution/R/lesson_10-R.ipynb) | <ul><li>Jen and Cassie</li><li>Eric Wanjau</li></ul> |
+| 10 | Introduction to classification | [Classification](4-Classification/README.md) | Clean, prep, and visualize your data; introduction to classification | <ul><li> [Python](4-Classification/1-Introduction/README.md) </li><li>[R](4-Classification/1-Introduction/solution/R/lesson_10.html) | <ul><li>Jen and Cassie</li><li>Eric Wanjau</li></ul> |
 | 11 | Delicious Asian and Indian cuisines 🍜 | [Classification](4-Classification/README.md) | Introduction to classifiers | <ul><li> [Python](4-Classification/2-Classifiers-1/README.md)</li><li>[R](4-Classification/2-Classifiers-1/solution/R/lesson_11.html) | <ul><li>Jen and Cassie</li><li>Eric Wanjau</li></ul> |
 | 12 | Delicious Asian and Indian cuisines 🍜 | [Classification](4-Classification/README.md) | More classifiers | <ul><li> [Python](4-Classification/3-Classifiers-2/README.md)</li><li>[R](4-Classification/3-Classifiers-2/solution/R/lesson_12.html) | <ul><li>Jen and Cassie</li><li>Eric Wanjau</li></ul> |
 | 13 | Delicious Asian and Indian cuisines 🍜 | [Classification](4-Classification/README.md) | Build a recommender web app using your model | [Python](4-Classification/4-Applied/README.md) | Jen |