Merge commit '44a70199c4b7e13380c6543533fa1798cb561082' as 'apps/web-bootstrap'

Lauren George 2021-09-21 15:28:56 -07:00
Parents e5e72405b2 44a70199c4
Commit 34b204e76b
53 changed files with 12764 additions and 0 deletions

25 apps/web-bootstrap/.gitignore vendored Normal file

@@ -0,0 +1,25 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.
# dependencies
/node_modules
/.pnp
.pnp.js
# testing
/coverage
# production
/build
# misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local
npm-debug.log*
yarn-debug.log*
yarn-error.log*
.idea

@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2021 Microsoft
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

@@ -0,0 +1,37 @@
![Web Bootstrap header](assets/header.jpg)
[Lobe](http://lobe.ai/) is a free, easy to use app that has everything you need to bring your machine learning ideas to life.
Web Bootstrap takes the machine learning model created in Lobe, and adds it to a project in the browser that uses
[React](https://reactjs.org), [Create React App](https://github.com/facebook/create-react-app), [TypeScript](https://www.typescriptlang.org/), and [TensorFlow.js](https://www.tensorflow.org/js).
## Get Started
1. Clone or download the project on your computer and install [Yarn](https://yarnpkg.com/). Yarn is the software package that will install all the dependencies and make sure the code automatically reloads when changes are made.
2. Run `yarn install` to install required dependencies and run `yarn start` to start the server in development mode. This will open a web browser to
`localhost:3000`. By default, this project is using the TensorFlow.js exported model from Lobe found in the `public/model/` folder.
3. To use your own model file, open your Lobe project, go to the Use tab, select Export, and click on the TensorFlow.js model file.
When exported, drag the `model.json`, `signature.json`, and all the `*.bin` files to the `public/model/` folder.
## Additional Information
Check out the [Create React App documentation](https://create-react-app.dev/docs/getting-started)
for more information on React and the project structure.
There are three main components: Camera, Prediction, and StaticImage.
The Camera component, which lives in `components/camera/Camera.tsx`, is responsible for displaying a live, full-screen view of the user's selected webcam.
The Prediction component, `components/prediction/Prediction.tsx`, is the box in the lower left-hand corner and is responsible for displaying the top prediction results and their confidences.
The StaticImage component, `components/staticImage/StaticImage.tsx`, displays an image selected from the file picker and runs it through the model from a canvas element.
### Known Issues
TensorFlow.js on Safari may have problems initializing the WebGL backend for acceleration and will fall back to the CPU.
You can use the WebAssembly (wasm) backend as an alternative to WebGL:
https://www.tensorflow.org/js/guide/platform_environment#wasm_backend
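For reference, a minimal sketch of that workaround (not part of this commit, and assuming the separately installed `@tensorflow/tfjs-backend-wasm` package):

// Hypothetical wasm-backend setup; assumes `yarn add @tensorflow/tfjs-backend-wasm`.
import * as tf from "@tensorflow/tfjs";
// Importing the package registers the "wasm" backend with TensorFlow.js.
import "@tensorflow/tfjs-backend-wasm";

export async function useWasmBackend(): Promise<void> {
  // Switch from the default backend to wasm, then wait until it is ready.
  await tf.setBackend("wasm");
  await tf.ready();
  console.log(`TensorFlow.js backend in use: ${tf.getBackend()}`);
}

In this project, such a call would belong in the model web worker, before the model is loaded.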
## Contributing
GitHub Issues are for reporting bugs, discussing features and general feedback on the Web Bootstrap project. Be sure to check our documentation, FAQ and past issues before opening any new ones.
To share your project, get feedback on it, and learn more about Lobe, please visit our community on [Reddit](https://www.reddit.com/r/Lobe/).
We look forward to seeing the amazing projects that can be built when machine learning is made accessible to you.

Binary data
apps/web-bootstrap/assets/header.jpg Normal file

Binary file not shown. Size: 614 KiB

@@ -0,0 +1,48 @@
{
"name": "lobe-web-sample",
"version": "0.2.0",
"private": true,
"dependencies": {
"@tensorflow/tfjs": "^3.1.0",
"@testing-library/jest-dom": "^5.11.4",
"@testing-library/react": "^11.1.0",
"@testing-library/user-event": "^12.1.10",
"@types/jest": "^26.0.15",
"@types/node": "^12.0.0",
"@types/react": "^17.0.0",
"@types/react-dom": "^17.0.0",
"react": "^17.0.1",
"react-dom": "^17.0.1",
"react-scripts": "4.0.2",
"react-webcam": "^5.2.3",
"typescript": "^4.1.2",
"web-vitals": "^1.0.1",
"workerize-loader": "^1.3.0"
},
"scripts": {
"start": "react-scripts start",
"build": "react-scripts build",
"test": "react-scripts test",
"eject": "react-scripts eject",
"svgr": "svgr -d src/Icons/ assets/svg/"
},
"eslintConfig": {
"extends": [
"react-app",
"react-app/jest"
]
},
"browserslist": {
"production": [
">0.2%",
"not dead",
"not op_mini all"
],
"development": [
"last 1 chrome version",
"last 1 firefox version",
"last 1 safari version"
]
},
"devDependencies": {}
}

Binary data
apps/web-bootstrap/public/favicon.ico Normal file

Binary file not shown. Size: 15 KiB

@@ -0,0 +1,43 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<link rel="icon" href="%PUBLIC_URL%/favicon.ico" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<meta name="theme-color" content="#000000" />
<meta
name="description"
content="Web site created using create-react-app"
/>
<link rel="apple-touch-icon" href="%PUBLIC_URL%/logo192.png" />
<!--
manifest.json provides metadata used when your web app is installed on a
user's mobile device or desktop. See https://developers.google.com/web/fundamentals/web-app-manifest/
-->
<link rel="manifest" href="%PUBLIC_URL%/manifest.json" />
<!--
Notice the use of %PUBLIC_URL% in the tags above.
It will be replaced with the URL of the `public` folder during the build.
Only files inside the `public` folder can be referenced from the HTML.
Unlike "/favicon.ico" or "favicon.ico", "%PUBLIC_URL%/favicon.ico" will
work correctly both with client-side routing and a non-root public URL.
Learn how to configure a non-root public URL by running `npm run build`.
-->
<title>Lobe Web</title>
</head>
<body>
<noscript>You need to enable JavaScript to run this app.</noscript>
<div id="root"></div>
<!--
This HTML file is a template.
If you open it directly in the browser, you will see an empty page.
You can add webfonts, meta tags, or analytics to this file.
The build step will place the bundled scripts into the <body> tag.
To begin the development, run `npm start` or `yarn start`.
To create a production bundle, use `npm run build` or `yarn build`.
-->
</body>
</html>

Binary data
apps/web-bootstrap/public/logo192.png Normal file

Binary file not shown. Size: 5.0 KiB

Binary data
apps/web-bootstrap/public/logo512.png Normal file

Binary file not shown. Size: 15 KiB

@@ -0,0 +1,25 @@
{
"short_name": "Lobe Web",
"name": "Lobe Web Sample App",
"icons": [
{
"src": "favicon.ico",
"sizes": "64x64 32x32 24x24 16x16",
"type": "image/x-icon"
},
{
"src": "logo192.png",
"type": "image/png",
"sizes": "192x192"
},
{
"src": "logo512.png",
"type": "image/png",
"sizes": "512x512"
}
],
"start_url": ".",
"display": "standalone",
"theme_color": "#000000",
"background_color": "#ffffff"
}

Binary data
apps/web-bootstrap/public/model/group1-shard1of4.bin Normal file

Binary file not shown.

Binary data
apps/web-bootstrap/public/model/group1-shard2of4.bin Normal file

Binary file not shown.

Binary data
apps/web-bootstrap/public/model/group1-shard3of4.bin Normal file

Binary file not shown.

Binary data
apps/web-bootstrap/public/model/group1-shard4of4.bin Normal file

Binary file not shown.

File diff suppressed because one or more lines are too long

@@ -0,0 +1 @@
{"doc_id": "62cfdd88-22bb-4601-8fbf-dc96ae2dcd12", "doc_name": "Web Thumbs", "doc_version": "7a8180a232ab91e6f673f6808f47db92", "format": "tf_js", "version": 37, "inputs": {"Image": {"dtype": "float32", "shape": [null, 224, 224, 3], "name": "Image:0"}}, "outputs": {"Confidences": {"dtype": "float32", "shape": [null, 3], "name": "62cfdd88-22bb-4601-8fbf-dc96ae2dcd12.cc3455b9-54e6-4a06-a65d-97205ca63ce4/dense_2/Softmax:0"}}, "tags": null, "classes": {"Label": ["None", "Thumb Down", "Thumb Up"]}, "filename": "model.json", "export_model_version": 1}
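
For orientation only (not part of this commit): the single-line signature.json above can be read as the following shape, sketched here as a hypothetical TypeScript type; the model-loading code later in this diff consumes these fields.

// Hypothetical type for the Lobe TensorFlow.js export signature shown above.
interface LobeSignature {
  doc_id: string;
  doc_name: string;
  doc_version: string;
  format: "tf_js";
  version: number;
  // Tensor specs keyed by logical name, e.g. "Image" and "Confidences"; null marks the dynamic batch dimension.
  inputs: { [key: string]: { dtype: string; shape: (number | null)[]; name: string } };
  outputs: { [key: string]: { dtype: string; shape: (number | null)[]; name: string } };
  tags: string[] | null;
  // Class labels keyed by output, e.g. {"Label": ["None", "Thumb Down", "Thumb Up"]}.
  classes: { [key: string]: string[] };
  filename: string;
  export_model_version?: number;
}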

@@ -0,0 +1,3 @@
# https://www.robotstxt.org/robotstxt.html
User-agent: *
Disallow:

@@ -0,0 +1,16 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg width="24px" height="24px" viewBox="0 0 24 24" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<title>check</title>
<g id="Pages" stroke="none" stroke-width="1" fill="none" fill-rule="evenodd">
<g id="Example-Web-App-Sources" transform="translate(-1649.000000, -293.000000)">
<g id="Group" transform="translate(1423.000000, 268.000000)">
<g id="Group-10" transform="translate(24.000000, 25.000000)">
<g id="Icon-/-General-/-Check" transform="translate(202.000000, 0.000000)">
<path d="M11.999485,2 C17.5229414,2 22,6.47785788 22,12 C22,17.523172 17.5229414,22 11.999485,22 C6.47705855,22 2,17.523172 2,12 C2,6.47785788 6.47705855,2 11.999485,2 Z M15.6918031,9.28535193 C15.300515,8.90488269 14.666168,8.90488269 14.2739592,9.28535193 L14.2739592,9.28535193 L10.9346606,12.5323212 L9.72673131,11.3568951 C9.33544322,10.9755306 8.7001755,10.9755306 8.30888742,11.3559999 L8.30888742,11.3559999 L8.29415657,11.3703234 C7.90194781,11.7507926 7.90194781,12.3684957 8.29415657,12.7498601 L8.29415657,12.7498601 L10.3776505,14.775747 C10.6851569,15.074751 11.1841643,15.074751 11.4916707,14.775747 L11.4916707,14.775747 L15.7065339,10.6774218 C16.097822,10.2969525 16.097822,9.68014473 15.7065339,9.29967548 L15.7065339,9.29967548 Z" id="sources-checkmark" fill="#FFFFFF"></path>
<g id="Icon-/-Small-/-Check" transform="translate(12.000000, 9.000000)"></g>
</g>
</g>
</g>
</g>
</g>
</svg>

@@ -0,0 +1,14 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg width="25px" height="25px" viewBox="0 0 25 25" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<title>close</title>
<g id="Pages" stroke="none" stroke-width="1" fill="none" fill-rule="evenodd" stroke-linecap="round" stroke-linejoin="round">
<g id="Example-Web-App-Image" transform="translate(-1660.000000, -223.000000)" stroke="#FFFFFF" stroke-width="2">
<g id="Toggle" transform="translate(1650.000000, 213.000000)">
<g id="close" transform="translate(10.000000, 10.000000)">
<line x1="18" y1="7" x2="7" y2="18" id="Path"></line>
<line x1="7" y1="7" x2="18" y2="18" id="Path"></line>
</g>
</g>
</g>
</g>
</svg>

@@ -0,0 +1,16 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg width="25px" height="25px" viewBox="0 0 25 25" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<title>gallery</title>
<g id="Pages" stroke="none" stroke-width="1" fill="none" fill-rule="evenodd">
<g id="Example-Web-App" transform="translate(-315.000000, -223.000000)">
<g id="Toggle" transform="translate(305.000000, 213.000000)">
<g id="Group-6" transform="translate(10.000000, 10.000000)">
<g id="Group-4" transform="translate(5.500000, 5.500000)">
<path d="M5.1277704,1.95404733e-17 L8.8722296,4.33876596e-16 C10.6552671,-2.52094797e-16 11.3018396,0.185651222 11.9536914,0.534265408 C12.6055433,0.882879593 13.1171204,1.39445674 13.4657346,2.04630859 C13.8143488,2.69816044 14,3.34473292 14,5.1277704 L14,8.8722296 C14,10.6552671 13.8143488,11.3018396 13.4657346,11.9536914 C13.1171204,12.6055433 12.6055433,13.1171204 11.9536914,13.4657346 C11.3018396,13.8143488 10.6552671,14 8.8722296,14 L5.1277704,14 C3.34473292,14 2.69816044,13.8143488 2.04630859,13.4657346 C1.39445674,13.1171204 0.882879593,12.6055433 0.534265408,11.9536914 C0.185651222,11.3018396 -5.14771143e-17,10.6552671 8.85964938e-17,8.8722296 L3.64820576e-16,5.1277704 C-2.11971261e-16,3.34473292 0.185651222,2.69816044 0.534265408,2.04630859 C0.882879593,1.39445674 1.39445674,0.882879593 2.04630859,0.534265408 C2.69816044,0.185651222 3.34473292,-1.13535777e-17 5.1277704,1.95404733e-17 Z" id="Rectangle" stroke="#FFFFFF" stroke-linecap="round" stroke-linejoin="round" fill-rule="nonzero"></path>
<path d="M6.24155915,0 L6.28205146,0 L7.71794886,0 L7.75844117,0 C9.07788729,0 10.1229334,0 10.9408206,0.10993981 C11.7825436,0.223110064 12.463877,0.461555186 13.0011898,0.998839277 C13.5384308,1.53613055 13.7768615,2.21742079 13.8900821,3.05916538 C14,3.87705254 14,4.92211303 14,6.24148736 L14,6.24155915 L14,6.28205146 L14,7.71794886 L14,7.75844117 L14,7.75851297 C14,9.07788729 14,10.1229334 13.8900821,10.9408206 C13.7768615,11.7825436 13.5384308,12.463877 13.0011898,13.0011898 C12.463877,13.5384308 11.7825436,13.7768615 10.9408206,13.8900821 C10.1229334,14 9.07788729,14 7.75851297,14 L7.75844117,14 L7.71794886,14 L6.28205146,14 L6.24155915,14 L6.24148736,14 C4.92211303,14 3.87705254,14 3.05916538,13.8900821 C2.21742079,13.7768615 1.53613055,13.5384308 0.998839277,13.0011898 C0.461555186,12.463877 0.223110064,11.7825436 0.10993981,10.9408206 C0,10.1229334 0,9.07788729 0,7.75844117 L0,7.71794886 L0,6.28205146 L0,6.24155915 C0,4.92212739 0,3.87705972 0.10993981,3.05916538 C0.223110064,2.21742079 0.461555186,1.53613055 0.998839277,0.998839277 C1.53613055,0.461555186 2.21742079,0.223110064 3.05916538,0.10993981 C3.87705972,0 4.92212739,0 6.24155915,0 Z M3.20266179,1.17726389 C2.48034078,1.27437363 2.06418182,1.45649568 1.76033875,1.76033875 C1.45649568,2.06418182 1.27437363,2.48034078 1.17726389,3.20266179 C1.07806492,3.94046895 1.07692338,4.91305252 1.07692338,6.28205146 L1.07692338,7.71794886 C1.07692338,8.48055397 1.07727517,9.12017447 1.09473568,9.66466677 C1.1043203,9.96340522 1.4550885,10.1026155 1.67473772,9.89986676 L3.17764128,8.51257449 C4.1366269,7.62734374 4.6161233,7.18472836 5.19402892,7.19628734 C5.77194891,7.20784631 6.23337454,7.66927194 7.15615401,8.59205141 L7.1562258,8.5921232 L8.1820309,9.61792831 C8.32224629,9.75814369 8.54976423,9.7579283 8.68997961,9.61771292 L8.68997961,9.61771292 L8.68997961,9.61771292 C9.59050267,8.71718987 10.0407283,8.26696424 10.6074052,8.24808218 C11.1740103,8.22920013 11.6532411,8.64855397 12.6117026,9.48718985 L12.6117026,9.48718985 C12.7282257,9.58913856 12.9100821,9.51009241 12.9132411,9.35530267 C12.9228616,8.88138473 12.9230769,8.34069757 12.9230769,7.71794886 L12.9230769,6.28205146 C12.9230769,4.91305252 12.9219282,3.94046895 12.8227077,3.20266179 C12.7256411,2.48034078 12.5434975,2.06418182 12.2396616,1.76033875 C11.9358257,1.45649568 11.5196308,1.27437363 10.7973744,1.17726389 C10.0595386,1.07806492 9.08693345,1.07692338 7.71794886,1.07692338 L6.28205146,1.07692338 C4.91305252,1.07692338 3.94046895,1.07806492 3.20266179,1.17726389 Z M10.5897437,4.84615406 C10.5897437,5.63920019 9.9468924,6.28205146 9.15384627,6.28205146 C8.36080013,6.28205146 7.71794886,5.63920019 7.71794886,4.84615406 C7.71794886,4.05312946 8.36080013,3.41025666 9.15384627,3.41025666 C9.9468924,3.41025666 10.5897437,4.05312946 10.5897437,4.84615406 Z" id="Shape" fill="#FFFFFF"></path>
</g>
</g>
</g>
</g>
</g>
</svg>

@@ -0,0 +1,13 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg width="25px" height="25px" viewBox="0 0 25 25" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<title>gear</title>
<g id="Pages" stroke="none" stroke-width="1" fill="none" fill-rule="evenodd">
<g id="Example-Web-App" transform="translate(-1660.000000, -223.000000)" fill="#FFFFFF" fill-rule="nonzero">
<g id="Toggle" transform="translate(1650.000000, 213.000000)">
<g id="Group-5" transform="translate(10.000000, 10.000000)">
<path d="M19.5133782,14.1386667 L18.4524523,13.2666667 C18.4844983,13.0153333 18.5069986,12.7606667 18.5069986,12.5006667 C18.5069986,12.2406667 18.4844983,11.986 18.4524523,11.7346667 L19.5133782,10.8626667 C20.0172498,10.4486667 20.1474791,9.73066667 19.8208831,9.16666667 L19.0510982,7.83533333 C18.7245021,7.27066667 18.0379003,7.02466667 17.4269815,7.25333333 L16.1362793,7.73666667 C15.730591,7.42733333 15.288084,7.164 14.8114856,6.96466667 L14.586482,5.61533333 C14.478753,4.97133333 13.9216988,4.5 13.2691884,4.5 L11.7303004,4.5 C11.0777901,4.5 10.5214177,4.97133333 10.4136887,5.614 L10.1886851,6.96333333 C9.71140483,7.16266667 9.26889782,7.426 8.86389141,7.73533333 L7.57250733,7.25266667 C6.96158856,7.024 6.27430495,7.27 5.9483907,7.83466667 L5.17860579,9.166 C4.85269154,9.73 4.98292087,10.4473333 5.48679248,10.8613333 L6.54771837,11.7333333 C6.51499058,11.9853333 6.49317205,12.24 6.49317205,12.5 C6.49317205,12.76 6.51567241,13.0146667 6.54771837,13.266 L5.48679248,14.138 C4.98292087,14.552 4.85269154,15.27 5.17928762,15.834 L5.94907253,17.1653333 C6.27566861,17.73 6.96227039,17.976 7.57318915,17.7473333 L8.86389141,17.264 C9.26957965,17.5733333 9.71208665,17.8366667 10.1886851,18.036 L10.4136887,19.3853333 C10.5214177,20.0286667 11.0784719,20.5 11.7309823,20.5 L13.2698703,20.5 C13.9223806,20.5 14.4794349,20.0286667 14.586482,19.386 L14.8114856,18.0366667 C15.2887659,17.8373333 15.7312729,17.574 16.1362793,17.2646667 L17.4269815,17.748 C18.0379003,17.9766667 18.7251839,17.7306667 19.05178,17.166 L19.8208831,15.8346667 C20.1474791,15.27 20.0172498,14.5526667 19.5133782,14.1386667 Z M12.5000853,15 C11.1194603,15 10.0000853,13.880625 10.0000853,12.5 C10.0000853,11.119375 11.1194603,10 12.5000853,10 C13.8807103,10 15.0000853,11.119375 15.0000853,12.5 C15.0000853,13.880625 13.8807103,15 12.5000853,15 L12.5000853,15 Z" id="Shape"></path>
</g>
</g>
</g>
</g>
</svg>

@@ -0,0 +1,17 @@
<?xml version="1.0" encoding="UTF-8"?>
<svg width="24px" height="24px" viewBox="0 0 24 24" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<title>no-check</title>
<g id="Pages" stroke="none" stroke-width="1" fill="none" fill-rule="evenodd" opacity="0.400000006">
<g id="Example-Web-App-Sources" transform="translate(-1649.000000, -333.000000)" fill-rule="nonzero" stroke="#FFFFFF" stroke-width="2">
<g id="Group" transform="translate(1423.000000, 268.000000)">
<g id="Group-10" transform="translate(24.000000, 25.000000)">
<g id="Group-9" transform="translate(0.000000, 40.000000)">
<g id="check" transform="translate(202.000000, 0.000000)">
<circle id="Oval" cx="12" cy="12" r="9"></circle>
</g>
</g>
</g>
</g>
</g>
</g>
</svg>

@@ -0,0 +1,55 @@
import React, {useCallback, useState} from 'react';
import Camera from './camera/Camera';
import Prediction from './prediction/Prediction';
import ImageSelectorButton from './staticImage/ImageSelectorButton';
import StaticImage from './staticImage/StaticImage';
// @ts-ignore
// eslint-disable-next-line import/no-webpack-loader-syntax
import ModelWorker from "workerize-loader!../model/worker";
// create our web worker instance for running the tfjs model without blocking the UI thread
const modelWorker = ModelWorker();
// the filepaths to our exported signature.json and model.json files (in the public/model folder)
const signatureFile = process.env.PUBLIC_URL + `/model/signature.json`;
const modelFile = process.env.PUBLIC_URL + `/model/model.json`;
// load our model in the web worker
modelWorker.loadModel(signatureFile, modelFile);
function App() {
// state for keeping track of our predictions -- map of {label: confidence} from running the model on an image
const [predictions, setPredictions] = useState<{[key: string]: number} | undefined>(undefined);
// state for using a static image from file picker
const [imageFile, setImageFile] = useState<File | null>(null);
// function to run the image from an html canvas element through our model
const predictCanvas = useCallback((canvas: HTMLCanvasElement) => {
// get the canvas context
const ctx = canvas.getContext('2d');
if (ctx) {
// get the pixel data from the full canvas
const imageData = ctx.getImageData(0, 0, canvas.width, canvas.height);
// run the async predict function and set the values to our state
modelWorker.predict(imageData).then((results: {Confidences: {[label: string]: number}}) => {
if (results) {
setPredictions(results.Confidences);
}
});
}
}, []);
return (
<div>
<ImageSelectorButton setImageFile={setImageFile} imageFile={imageFile} />
{
!imageFile ?
<Camera predictCanvas={predictCanvas} predictions={predictions} /> :
<StaticImage predictCanvas={predictCanvas} image={imageFile} setImageFile={setImageFile} />
}
<Prediction predictions={predictions}/>
</div>
);
}
export default App;

@@ -0,0 +1,8 @@
.blur-container {
color: #ffffff;
padding: 16px;
-webkit-backdrop-filter: blur(40px);
backdrop-filter: blur(40px);
background-color: rgba(51, 51, 51, 0.32);
min-width: 80px;
}

@@ -0,0 +1,21 @@
import React, {FunctionComponent} from "react";
import './BlurContainer.css';
type BlurContainerType = {
additionalClassname?: string
}
const BlurContainer: FunctionComponent<BlurContainerType> = ({ additionalClassname, children }) => {
// simple container with rounded corners and a blurred translucent background
let className = "blur-container";
if (!!additionalClassname) {
className = className + ` ${additionalClassname}`;
}
return (
<div className={className}>
{ children }
</div>
);
}
export default BlurContainer;

@@ -0,0 +1,12 @@
.square-button {
border-radius: 16px;
overflow: hidden;
width: 45px;
min-width: auto;
height: 45px;
margin-bottom: 10px;
padding: 0!important;
display: flex;
justify-content: center;
align-items: center;
}

@@ -0,0 +1,26 @@
import React, {FunctionComponent} from "react";
import BlurContainer from "./BlurContainer";
import "./SquareButton.css";
type SquareButtonProps = {
onClick?: () => void,
setHover?: (hovering: boolean) => void
};
const SquareButton: FunctionComponent<SquareButtonProps> = ({ onClick, setHover, children }) => {
// Square button that is inside the blur container
return (
<div
onClick={onClick ? () => onClick() : undefined}
onMouseEnter={setHover ? () => setHover(true) : undefined}
onMouseLeave={setHover ? () => setHover(false) : undefined}
>
<BlurContainer additionalClassname="square-button">
{ children }
</BlurContainer>
</div>
);
}
export default SquareButton;

@@ -0,0 +1,11 @@
#video-container {
width: 100vw;
height: 100vh;
}
video {
width: 100%;
height: 100%;
object-fit: cover;
object-position: center;
}

@@ -0,0 +1,97 @@
import React, {useEffect, useState, useRef, useCallback} from "react";
import Webcam from "react-webcam";
import SourceSelector from "./SourceSelector";
import "./Camera.css";
type CameraProps = {
predictCanvas: (canvas: HTMLCanvasElement) => void;
predictions?: { [label: string]: number };
}
// Our webcam display and capture component
function Camera({ predictCanvas, predictions }: CameraProps) {
const [devices, setDevices] = useState<MediaDeviceInfo[]>([]);
const [deviceId, setDeviceId] = useState<string | undefined>(undefined);
const [imageFlip, setImageFlip] = useState(true);
const webcamRef = useRef<Webcam>(null);
const [selectorVisible, setSelectorVisible] = useState(false);
// handle any webcam plugged into the computer
// https://github.com/mozmorris/react-webcam#show-all-cameras-by-deviceid
const handleDevices = useCallback(
(mediaDevices: MediaDeviceInfo[]) => {
// find all the webcams
const videoDevices = mediaDevices.filter(({kind}) => kind === "videoinput");
setDevices(videoDevices);
// set our initial webcam to be the first in the list
if (videoDevices.length > 0) {
setDeviceId(videoDevices[0].deviceId);
}
},[setDevices, setDeviceId]
);
useEffect(() => {
navigator.mediaDevices.enumerateDevices().then(handleDevices);
}, [handleDevices]);
// function to grab the current frame drawn on canvas from the webcam
const getCanvas: () => Promise<HTMLCanvasElement | undefined> = useCallback(async () => {
let newImage;
if (webcamRef.current) {
newImage = webcamRef.current.getCanvas();
if (newImage) {
return newImage;
}
}
}, [webcamRef]);
// helper for waiting in our loop when the camera is loading (getting the image)
const sleep = useCallback((ms: number) => {
return new Promise<NodeJS.Timeout>(function (resolve, reject) {
setTimeout(resolve, ms);
});
}, []);
// while we have the webcam mounted, predict frames as fast as we get new predictions back from the model
useEffect(() => {
getCanvas().then(async (canvas: HTMLCanvasElement | undefined) => {
let currentCanvas = canvas;
while (!currentCanvas) {
// if no canvas, wait 500ms and try again
await sleep(500);
currentCanvas = await getCanvas();
}
if (currentCanvas) {
predictCanvas(currentCanvas);
}
})
}, [sleep, predictions, deviceId, getCanvas, predictCanvas])
return (
<div id="video-container" onClick={() => setSelectorVisible(false)}>
<SourceSelector
devices={devices}
deviceId={deviceId}
setDeviceId={setDeviceId}
imageFlip={imageFlip}
setImageFlip={setImageFlip}
selectorVisible={selectorVisible}
setSelectorVisible={setSelectorVisible}
/>
<Webcam
ref={webcamRef}
screenshotFormat="image/jpeg"
forceScreenshotSourceSize={true}
screenshotQuality={1}
audio={false}
videoConstraints={{
width: {ideal: 1920},
height: {ideal: 1080},
deviceId: !!deviceId ? {exact: deviceId} : undefined
}}
mirrored={imageFlip}
/>
</div>
);
}
export default Camera;

@@ -0,0 +1,61 @@
#camera-select-button {
position: absolute;
z-index: 1;
top: 20px;
right: 20px;
display: flex;
flex-direction: column;
align-items: flex-end;
}
#gear-icon {
height: 24px;
width: 24px;
transition: transform 0.3s;
}
.gear-rotated {
transform: rotate(90deg);
}
.source-selector {
border-radius: 24px;
transition: all 0.3s;
transform-origin: top right;
transform: scaleY(0) scaleX(0.1);
opacity: 0;
}
.source-expanded {
transform: scaleY(1) scaleX(1);
opacity: 1;
}
.toggle-container {
margin-left: -4px;
margin-right: -4px;
padding: 15px 12px 14px;
border-radius: 14px;
background-color: rgba(255,255,255,.1);
}
.toggle-item-container {
display: flex;
flex-direction: row;
align-items: center;
justify-content: space-between;
margin-bottom: 18px;
}
.toggle-item-container:last-child {
margin-bottom: 0;
}
.toggle-item {
margin-left: 0!important;
margin-right: 52px!important;
}
.toggle-radio-button {
height: 24px;
width: 24px;
}

@@ -0,0 +1,59 @@
import React, {useState} from "react";
import SquareButton from "../SquareButton";
import BlurContainer from "../BlurContainer";
import SourceSelectorItem from "./SourceSelectorItem";
import gear from "../../Icons/gear.svg";
import "./SourceSelector.css";
import check from "../../Icons/check.svg";
import noCheck from "../../Icons/no-check.svg";
type SourceSelectorProps = {
devices: MediaDeviceInfo[]
deviceId?: string,
setDeviceId: (deviceId: string) => void,
imageFlip: boolean,
setImageFlip: (imageFlip: boolean) => void,
selectorVisible: boolean,
setSelectorVisible: (visible: boolean) => void
}
// Component for selecting the webcam source and flipping the image horizontally
function SourceSelector({devices, deviceId, setDeviceId, imageFlip, setImageFlip, selectorVisible, setSelectorVisible}: SourceSelectorProps) {
const [hovering, setHover] = useState(false);
return (
<div
id="camera-select-button"
onClick={(e) => e.stopPropagation()}
onMouseLeave={() => {setSelectorVisible(false)}}
>
<SquareButton setHover={(hovering) => {setHover(hovering); if (hovering) setSelectorVisible(true);}}>
<img id="gear-icon" src={gear} alt={"Gear Icon"} className={selectorVisible || hovering ? "gear-rotated" : undefined} />
</SquareButton>
<BlurContainer additionalClassname={`source-selector${selectorVisible ? " source-expanded" : ""}`}>
{devices.map((device, key) => (
<SourceSelectorItem
name={!!device.label ? device.label.replace(/\(.*\)/g, '') : `Device ${key + 1}`}
onSelect={() => setDeviceId(device.deviceId)}
selected={device.deviceId === deviceId}
key={device.deviceId}
/>
))}
<div className="toggle-container">
<div className="toggle-item-container">
<div className={`toggle-item source-device${(imageFlip) ? " source-selected" : ""}`}>
{"Flip Image"}
</div>
<div onClick={() => setImageFlip(!imageFlip)} className="toggle-radio-button">
<img src={imageFlip ? check : noCheck} alt={'Flip Webcam Button'} />
</div>
</div>
</div>
</BlurContainer>
</div>
)
}
export default SourceSelector;

@@ -0,0 +1,33 @@
.source-item-container {
display: flex;
flex-direction: row;
align-items: center;
justify-content: space-between;
margin-bottom: 18px;
}
.source-item-container:first-child {
margin-top: 10px;
}
.source-device {
font-size: 16px;
font-weight: 500;
font-stretch: normal;
font-style: normal;
line-height: 1.31;
letter-spacing: normal;
color: #ffffff;
opacity: 0.6;
margin-left: 8px;
margin-right: 44px;
}
.source-selected {
opacity: 1;
}
.source-radio-button {
margin-right: 8px;
height: 24px;
width: 24px;
}

@@ -0,0 +1,29 @@
import React from "react";
import check from "../../Icons/check.svg";
import noCheck from "../../Icons/no-check.svg";
import "./SourceSelectorItem.css";
type SourceSelectorItemProps = {
name: string,
selected: boolean,
onSelect: () => void
}
// Component for a single webcam source entry in the source selector list
function SourceSelectorItem({selected, onSelect, name}: SourceSelectorItemProps) {
return (
<div key={name} className="source-item-container" >
<div className={`source-device${(selected) ? " source-selected" : ""}`}>
{name}
</div>
<div onClick={() => onSelect()} className="source-radio-button">
<img src={selected ? check : noCheck} alt={'Select device'} />
</div>
</div>
)
}
export default SourceSelectorItem;

@@ -0,0 +1,9 @@
.prediction-container {
border-radius: 39px;
position: absolute;
z-index: 1;
bottom: 40px;
left: 40px;
font-size: 38px;
font-weight: 500;
}

@@ -0,0 +1,34 @@
import React from "react";
import BlurContainer from "../BlurContainer";
import PredictionEntry from "./PredictionEntry";
import "./Prediction.css";
type PredictionProps = {
predictions?: { [label: string]: number },
top?: number
}
function Prediction({predictions, top=3}: PredictionProps) {
// display the top N (default 3) predictions returned from the model
let sortedPredictions: Array<[string, number]> | undefined;
if (!!predictions) {
// sort our predictions by the confidence value and take the top N
sortedPredictions = Object.entries(predictions)
.sort((a, b) => b[1] - a[1])
.slice(0, top);
}
return (
<div id="predictions">
<BlurContainer additionalClassname="prediction-container">
{!!sortedPredictions ?
sortedPredictions.map(([label, confidence], idx) => (
<PredictionEntry key={label} label={label} confidence={confidence} top={idx===0} />
))
: <PredictionEntry label={'Loading...'} />
}
</BlurContainer>
</div>
);
}
export default Prediction;

@@ -0,0 +1,28 @@
.prediction-entry {
position: relative;
padding: 0 144px 0 16px;
margin-bottom: 12px;
line-height: 58px;
height: 58px;
}
.prediction-entry:last-child {
margin-bottom: 0;
}
.prediction-bar {
position: absolute;
z-index: -1;
left: 0;
top: 0;
height: 58px;
-webkit-backdrop-filter: blur(10px);
backdrop-filter: blur(10px);
background-color: rgba(255, 255, 255, 0.2);
border-radius: 23px;
min-width: 46px;
transition: width .3s linear;
}
.prediction-green {
background-color: #00ddb3;
}

@@ -0,0 +1,26 @@
import React from "react";
import "./PredictionEntry.css";
type PredictionEntryProps = {
label: string
confidence?: number
top?: boolean
}
function PredictionEntry({label, confidence, top}: PredictionEntryProps) {
// render the predicted label and a bar representing the confidence value
// make the top confidence value green
return (
<div key={label} className="prediction-entry">
{label}
{!!confidence ?
<div
className={"prediction-bar" + (top ? " prediction-green" : "")}
style={{width: (confidence*100).toString() + "%"}}
/>
: null}
</div>
);
}
export default PredictionEntry;

@@ -0,0 +1,6 @@
#image-close-button {
position: absolute;
z-index: 1;
top: 20px;
right: 20px;
}

@@ -0,0 +1,27 @@
import React from "react";
import SquareButton from "../SquareButton";
import close from "../../Icons/close.svg";
import "./ImageCloseButton.css";
type ImageCloseButtonProps = {
setImageFile: (image: File | null) => void;
};
// Component for clearing the static image and going back to the webcam view
function ImageCloseButton({setImageFile}: ImageCloseButtonProps) {
const onClick = () => {
setImageFile(null);
};
return (
<div id="image-close-button">
<SquareButton onClick={onClick}>
<img id="close-icon" src={close} alt={"Close"} />
</SquareButton>
</div>
)
}
export default ImageCloseButton;

@@ -0,0 +1,6 @@
#image-select-button {
position: absolute;
z-index: 1;
top: 20px;
left: 20px;
}

@@ -0,0 +1,55 @@
import React, {useEffect, useRef} from "react";
import SquareButton from "../SquareButton";
import gallery from "../../Icons/gallery.svg";
import "./ImageSelectorButton.css";
type ImageSelectorButtonProps = {
setImageFile: (image: File | null) => void;
imageFile: File | null;
};
// Component for selecting an image file for prediction
function ImageSelectorButton({setImageFile, imageFile}: ImageSelectorButtonProps) {
// ref to the hidden file input element
const fileInput = useRef<HTMLInputElement>(null);
// if we have a null file (from clearing the image), clear the file input value
useEffect(
() => {
if (!imageFile && fileInput.current) {
fileInput.current.value = "";
}
},
[imageFile, fileInput]
)
// make an onclick that will open the file dialog
const onClick = () => {
if (fileInput.current) {
fileInput.current.click();
}
}
// set our image file from the picker
const onChange = (e: React.ChangeEvent<HTMLInputElement>) => {
const files = e.target.files;
if (files && files.length > 0) {
setImageFile(files[0]);
}
}
return (
<div
id="image-select-button"
onClick={onClick}
>
<SquareButton>
<img id="gallery-icon" src={gallery} alt={"File Selector"} />
</SquareButton>
<input ref={fileInput} type="file" accept="image/*" onChange={onChange} style={{display: "none"}} />
</div>
)
}
export default ImageSelectorButton;

@@ -0,0 +1,7 @@
#static-image {
width: 100vw;
height: 100vh;
background-position: center;
background-size: contain;
background-repeat: no-repeat;
}

@@ -0,0 +1,58 @@
import React, {useEffect, useRef} from "react";
import ImageCloseButton from "./ImageCloseButton";
import "./StaticImage.css";
type StaticImageProps = {
predictCanvas: (canvas: HTMLCanvasElement) => void;
image: File;
setImageFile: (image: File | null) => void;
}
// Component for displaying our selected image file for prediction
function StaticImage({ predictCanvas, image, setImageFile }: StaticImageProps) {
// display our image file on a canvas and call the predict function with that canvas
const canvas = useRef<HTMLCanvasElement>(null);
const display = useRef<HTMLDivElement>(null);
useEffect(() => {
const reader = new FileReader();
reader.onload = (e) => {
// make an image to draw on the canvas
const img = new Image();
img.onload = () => {
if (canvas.current) {
// draw the image on our canvas
canvas.current.width = img.width;
canvas.current.height = img.height;
const ctx = canvas.current.getContext("2d");
if (!!ctx) {
// draw our image on the context
ctx.drawImage(img,0,0);
// drawing is finished, run the prediction!
predictCanvas(canvas.current);
}
}
}
// load the image from our reader
if (e.target) {
img.src = e.target.result as string;
if (display.current) {
display.current.style.backgroundImage = `url(${e.target.result})`;
}
}
}
// read our image file and process it!
reader.readAsDataURL(image);
}, [image, predictCanvas, display]);
return (
<div id="static-image" ref={display}>
<ImageCloseButton setImageFile={setImageFile} />
<canvas ref={canvas} style={{display: "none"}} />
</div>
)
}
export default StaticImage;

@@ -0,0 +1,13 @@
body {
margin: 0;
font-family: SFProDisplay, -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
sans-serif;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
code {
font-family: source-code-pro, Menlo, Monaco, Consolas, 'Courier New',
monospace;
}

@@ -0,0 +1,17 @@
import React from 'react';
import ReactDOM from 'react-dom';
import './index.css';
import App from './components/App';
import reportWebVitals from './reportWebVitals';
ReactDOM.render(
<React.StrictMode>
<App />
</React.StrictMode>,
document.getElementById('root')
);
// If you want to start measuring performance in your app, pass a function
// to log results (for example: reportWebVitals(console.log))
// or send to an analytics endpoint. Learn more: https://bit.ly/CRA-vitals
reportWebVitals();

@@ -0,0 +1,117 @@
import * as tf from '@tensorflow/tfjs';
const LEGACY_VERSION = -1;
const SUPPORTED_VERSIONS = [LEGACY_VERSION, 1]; // use -1 for legacy Lobe exports without the version property
class ImageClassificationModel {
signaturePath: string;
signature: any;
modelPath: string;
height: number = 224;
width: number = 224;
outputName: string = '';
inputKey = "Image";
outputKey = "Confidences";
labelKey = "Label";
labels: string[] = [];
version?: number;
model?: tf.GraphModel;
constructor(signaturePath: string, modelPath: string) {
/* Construct our model from the path to Lobe's exported signature.json and model.json files */
this.signaturePath = signaturePath;
this.modelPath = modelPath;
}
async load() {
/* Load our TensorFlow.js GraphModel */
const signatureFile = await fetch(this.signaturePath);
this.signature = await signatureFile.json();
[this.width, this.height] = this.signature.inputs[this.inputKey].shape.slice(1,3);
this.outputName = this.signature.outputs[this.outputKey].name;
this.labels = this.signature.classes[this.labelKey];
this.version = this.signature.export_model_version || LEGACY_VERSION;
if (!this.version || !SUPPORTED_VERSIONS.includes(this.version)) {
const versionMessage = `The model version ${this.version} you are using for this starter project may not be compatible with the supported versions ${SUPPORTED_VERSIONS}. Please update both this starter project and Lobe to latest versions, and try exporting your model again. If the issue persists, please contact us at lobesupport@microsoft.com`;
console.error(versionMessage);
throw new Error(versionMessage);
}
this.model = await tf.loadGraphModel(this.modelPath);
}
dispose() {
/* Free up the memory used by the TensorFlow.js GraphModel */
if (this.model) {
this.model.dispose();
this.model = undefined;
}
}
async predict(imageData: ImageData) {
/*
Given an input image data from a Canvas,
preprocess the image into a tensor with pixel values of [0,1], center crop to a square
and resize to the image input size, then run the prediction!
*/
if (this.model) {
// use tf tidy to dispose of intermediate tensors automatically
const confidencesTensor = tf.tidy(() => {
// create a tensor from the canvas image data
const image = tf.browser.fromPixels(imageData);
const [imgHeight, imgWidth] = image.shape.slice(0,2);
// convert image to 0-1
const normalizedImage = tf.div(image, tf.scalar(255));
// make into a batch of 1 so it is shaped [1, height, width, 3]
const batchImage: tf.Tensor4D = tf.expandDims(normalizedImage);
// center crop and resize
/*
Instead of center cropping, you can use any number of methods for making the image square and the right shape.
You can resize (squeeze or expand height/width to fit), pad with 0's so that the whole image is square and has black bars,
or pad with different pixel values like mirroring. We recommend using the same resize function here that was used during
training or the creation of your dataset. Lobe by default will center crop to a square.
*/
let top = 0;
let left = 0;
let bottom = 1;
let right = 1;
if (imgHeight !== imgWidth) {
// the crops are normalized 0-1 percentage of the image dimension
const size = Math.min(imgHeight, imgWidth);
left = (imgWidth - size) / 2 / imgWidth;
top = (imgHeight - size) / 2 / imgHeight;
right = (imgWidth + size) / 2 / imgWidth;
bottom = (imgHeight + size) / 2 / imgHeight;
}
// center crop our image and resize it to the size found in signature.json
const croppedImage = tf.image.cropAndResize(
batchImage, [[top, left, bottom, right]], [0], [this.height, this.width]
);
// run the model on our image and await the results as an array
if (this.model) {
return this.model.execute(
{[this.signature.inputs[this.inputKey].name]: croppedImage}, this.outputName
);
}
}) as (tf.Tensor | undefined);
if (confidencesTensor) {
// grab the array of values from the tensor data
const confidencesArray = await confidencesTensor.data();
// now that we have the array values, we can dispose the tensor and free memory
confidencesTensor.dispose();
// return a map of [label]: confidence computed by the model
// the list of labels maps in the same index order as the outputs from the results
return {
[this.outputKey]: this.labels.reduce(
(returnConfidences, label, idx) => {
return {[label]: confidencesArray[idx], ...returnConfidences}
}, {}
)
}
}
} else {
console.error("Model not loaded, please await this.load() first.");
}
}
}
export default ImageClassificationModel
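
For orientation (not part of this commit), a minimal sketch of exercising this class directly, outside the web worker, with image data taken from a canvas; the /model/ paths mirror the defaults used elsewhere in this project:

import ImageClassificationModel from "./imageClassificationModel";

async function classifyCanvas(canvas: HTMLCanvasElement) {
  // Load the exported Lobe model from the files served out of public/model/.
  const model = new ImageClassificationModel("/model/signature.json", "/model/model.json");
  await model.load();
  const ctx = canvas.getContext("2d");
  if (!ctx) return undefined;
  const imageData = ctx.getImageData(0, 0, canvas.width, canvas.height);
  // Resolves to e.g. { Confidences: { "Thumb Up": 0.92, "Thumb Down": 0.05, "None": 0.03 } }.
  const results = await model.predict(imageData);
  model.dispose();
  return results;
}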

@@ -0,0 +1,29 @@
/*
This file is our web worker to run the TensorFlow.js models asynchronously and not block the UI thread.
*/
import ImageClassificationModel from "./imageClassificationModel";
let model: ImageClassificationModel;
export async function loadModel(signaturePath: string, modelPath: string) {
// loads our exported Lobe model from the signature and model files
disposeModel();
model = new ImageClassificationModel(signaturePath, modelPath);
await model.load();
}
export function disposeModel() {
// frees up memory used by the model
if (model) {
model.dispose();
}
}
export async function predict(data: ImageData) {
// run the input data through the model
if (model) {
return await model.predict(data);
} else {
console.log('Predict called without model loaded.')
}
}

1 apps/web-bootstrap/src/react-app-env.d.ts vendored Normal file

@@ -0,0 +1 @@
/// <reference types="react-scripts" />

@@ -0,0 +1,15 @@
import { ReportHandler } from 'web-vitals';
const reportWebVitals = (onPerfEntry?: ReportHandler) => {
if (onPerfEntry && onPerfEntry instanceof Function) {
import('web-vitals').then(({ getCLS, getFID, getFCP, getLCP, getTTFB }) => {
getCLS(onPerfEntry);
getFID(onPerfEntry);
getFCP(onPerfEntry);
getLCP(onPerfEntry);
getTTFB(onPerfEntry);
});
}
};
export default reportWebVitals;

@@ -0,0 +1,5 @@
// jest-dom adds custom jest matchers for asserting on DOM nodes.
// allows you to do things like:
// expect(element).toHaveTextContent(/react/i)
// learn more: https://github.com/testing-library/jest-dom
import '@testing-library/jest-dom';

@@ -0,0 +1,26 @@
{
"compilerOptions": {
"target": "es5",
"lib": [
"dom",
"dom.iterable",
"esnext"
],
"allowJs": true,
"skipLibCheck": true,
"esModuleInterop": true,
"allowSyntheticDefaultImports": true,
"strict": true,
"forceConsistentCasingInFileNames": true,
"noFallthroughCasesInSwitch": true,
"module": "esnext",
"moduleResolution": "node",
"resolveJsonModule": true,
"isolatedModules": true,
"noEmit": true,
"jsx": "react-jsx"
},
"include": [
"src"
]
}

11593 apps/web-bootstrap/yarn.lock Normal file

File diff not shown because it is too large.