# NLP Scenarios

This folder contains examples and best practices, written in Jupyter notebooks, for building Natural Language Processing systems for different scenarios.

## Summary

The following is a summary of the scenarios covered in the best practice notebooks. Each scenario is demonstrated in one or more Jupyter notebook examples that make use of the core code base of models and utilities.

| Scenario | Applications | Models |
|---|---|---|
| Text Classification | Topic Classification | BERT |
| Named Entity Recognition | Wikipedia NER | BERT |
| Entailment | XNLI Natural Language Inference | BERT |
| Question Answering | SQuAD | BiDAF |
| Sentence Similarity | STS Benchmark | Representation: TF-IDF, Word Embeddings, Doc Embeddings<br>Metrics: Cosine Similarity, Word Mover's Distance |
| Embeddings | Custom Embeddings Training | Word2Vec<br>fastText<br>GloVe |
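
As a concrete illustration of how a representation and a metric combine in the sentence similarity scenario, the sketch below pairs TF-IDF vectors with cosine similarity using scikit-learn. It is a minimal, self-contained example, not the repository's own utilities, and the two sample sentences are made up for illustration rather than taken from the STS Benchmark.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Two sentences to compare (hypothetical examples, not from the STS Benchmark).
sentences = [
    "A man is playing a guitar.",
    "Someone is playing an instrument.",
]

# Represent each sentence as a TF-IDF vector over the pair's shared vocabulary.
vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(sentences)

# Score similarity as the cosine of the angle between the two vectors.
score = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
print(f"TF-IDF cosine similarity: {score:.3f}")
```

The same pattern applies to the other representations in the table: swap the TF-IDF vectors for word or document embeddings and reuse the same cosine similarity (or Word Mover's Distance) as the metric.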

## Azure-enhanced notebooks

Azure products and services are used in certain notebooks to enhance the efficiency of developing Natural Language systems at scale.

To run these notebooks successfully, users need an Azure subscription or can try Azure for free.

The Azure products featured in the notebooks include:

* **Azure Machine Learning service** - Azure Machine Learning service is a cloud service used to train, deploy, automate, and manage machine learning models, all at the broad scale that the cloud provides. It is used across various notebooks for AI model development tasks such as the following (a minimal metric-logging sketch follows this list):

  * Using Datastores
  * Tracking and monitoring metrics to enhance the model creation process
  * Distributed training
  * Hyperparameter tuning
  * Scaling up and out on Azure Machine Learning Compute
  * Deploying a web service to both Azure Container Instance and Azure Kubernetes Service
* **Azure Kubernetes Service** - You can use Azure Machine Learning service to host your classification model in a web service deployment on Azure Kubernetes Service (AKS). AKS is good for high-scale production deployments and provides autoscaling and fast response times.

* **Azure Container Instance** - You can use Azure Machine Learning service to host your classification model in a web service deployment on Azure Container Instance (ACI). ACI is good for low-scale, CPU-based workloads.
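
A minimal sketch of the metric tracking mentioned above, assuming the `azureml-sdk` package is installed and a workspace `config.json` has been downloaded locally; the experiment name and metric value are placeholders, not values from the notebooks.

```python
from azureml.core import Workspace, Experiment

# Load an existing workspace from a local config.json (assumes one has been downloaded).
ws = Workspace.from_config()

# Create (or reuse) an experiment to group related runs; the name is a placeholder.
experiment = Experiment(workspace=ws, name="nlp-text-classification")

# Start an interactive run and log a metric so it appears in the Azure ML run history.
run = experiment.start_logging()
run.log("accuracy", 0.85)  # placeholder value for illustration
run.complete()
```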

Other Azure services or products may also be used in some notebooks; an introduction and/or references for those are provided within the notebooks themselves.