BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering
Unclaimed

BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering Reviews & Product Details

BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering Overview

What is BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering?

This is an Extractive Question Answering model built on a Text Embedding model from [PyTorch Hub](https://pytorch.org/hub/huggingface_pytorch-transformers/). It takes a pair of question-context strings as input and returns a sub-string of the context as the answer to the question. The Text Embedding model, which is pre-trained on multilingual Wikipedia, returns an embedding of the input question-context pair.
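To make that input/output contract concrete, here is a minimal, self-contained sketch of the span-selection step that extractive QA heads perform. The scores below are hand-made stand-ins for the start/end logits a fine-tuned model would produce; this illustrates the mechanism only, not the actual PyTorch Hub code:

```python
# Span selection in extractive QA: the model scores every context token
# as a potential answer start and as a potential answer end, and the
# best-scoring valid pair (start <= end) is returned as a sub-string of
# the context. Scores here are invented stand-ins for model logits.

def extract_answer(context_tokens, start_scores, end_scores, max_len=10):
    best_score, best_span = float("-inf"), (0, 0)
    for i, s in enumerate(start_scores):
        # Only consider ends at or after the start, within max_len tokens.
        for j in range(i, min(i + max_len, len(end_scores))):
            if s + end_scores[j] > best_score:
                best_score, best_span = s + end_scores[j], (i, j)
    i, j = best_span
    return " ".join(context_tokens[i:j + 1])

# Pretend logits for the question "When was BERT released?"
context = "BERT was released by Google in 2018".split()
start = [0.1, 0.0, 0.2, 0.0, 0.3, 0.1, 2.0]
end = [0.0, 0.1, 0.1, 0.0, 0.2, 0.0, 2.5]
print(extract_answer(context, start, end))  # → 2018
```

Note that the answer is always a literal sub-string of the supplied context; the model cannot generate text that is not already there.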

BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering Details
Product Description

This is an Extractive Question Answering model built on a Text Embedding model from [PyTorch Hub](https://pytorch.org/hub/huggingface_pytorch-transformers/). It takes a pair of question-context strings as input and returns a sub-string of the context as the answer to the question. The Text Embedding model, which is pre-trained on multilingual Wikipedia, returns an embedding of the input question-context pair.


Seller

Amazon Web Services (AWS)

Description

By giving customers more of what they want - low prices, vast selection, and convenience - Amazon continues to grow and evolve as a world-class e-commerce platform.

Recent BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering Reviews

Verified User, Enterprise (> 1000 emp.)
4.5 out of 5
"BERT BASE - Works Perfectly well"
Language Model Tokenizer. It works well with all sets of data and in all generic industries.
Rishika J., Mid-Market (51-1000 emp.)
4.0 out of 5
"BERT: A question answering model by PyTorch"
One of the best parts about this particular PyTorch transformer is its support for more than 100 languages. BERT is integrated with the most effici...
Tarang N., Mid-Market (51-1000 emp.)
4.0 out of 5
"BERT: A unicase for Multilingual Base Model"
BERT Base Multilingual Uncased PyTorch Hub is a transformer model as it helps the computer to understand the multilingual data of different languag...

BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering Media


5 BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering Reviews

4.1 out of 5
Rishika J.
Software Engineer II
Mid-Market (51-1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering?

One of the best parts about this particular PyTorch transformer is its support for more than 100 languages. BERT integrates efficient neural network architectures, training objectives, and transfer learning. It is a pre-trained model that can be fine-tuned to high accuracy on available datasets like SQuAD. It answers questions concisely and even helps in other use cases, like highlighting the crucial entry points in a paragraph when a question is asked. Review collected by and hosted on G2.com.

What do you dislike about BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering?

The accuracy and the vast support for large datasets for different languages make BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering an expensive model. Due to the large dataset, this model is a bit slow to train, requires updating a lot of weights, and takes more computation time. Review collected by and hosted on G2.com.

What problems is BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering solving and how is that benefiting you?

With BERT, my organization aimed to understand and learn implementations of Natural Language Processing (NLP) in daily use cases. We used this model to help answer the frequently asked questions by the customers from the context documentation. Since the model supports so many languages and is trained using vast datasets, it really helped to concisely answer the questions from the context provided even when it was not present in the data set it used while training. Review collected by and hosted on G2.com.

Jagadis P.
Product Specialist (Order to Cash)
Enterprise (> 1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering?

PyTorch BERT is one of the best extractive question answering tools, built on a text embedding approach. It takes a pair of question-context strings as input and returns a related contextual sub-string that closely matches the actual answer to the question. The best part of this setup is that it is pre-trained on a multilingual corpus, which helps in processing question-context pairs. Review collected by and hosted on G2.com.

What do you dislike about BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering?

AI & ML are doing marvelous jobs, but we still have not achieved the level that we want. Sometimes the model acts oddly, returning an answer or string that is related to the question in a vocabulary sense but not a contextual one. This can be set aside as an exception, because these are very rare instances where your utterances are not set up properly. Review collected by and hosted on G2.com.

What problems is BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering solving and how is that benefiting you?

One of the best things this tool is helping us with is its multilingual setup. Its pre-trained models in multiple languages also help with masked language modeling. This is very strong, as it also helps in handling raw or unstructured text. Review collected by and hosted on G2.com.

Tarang N.
Systems Associate - Trainee
Mid-Market (51-1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering?

BERT Base Multilingual Uncased PyTorch Hub is a transformer model that helps the computer understand multilingual data from different languages in one uncased form. With the help of artificial intelligence, it predicts the next sentence and, by randomly masking some of the words, can be run to complete the whole sentence. Review collected by and hosted on G2.com.

What do you dislike about BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering?

There isn't anything that I don't like about BERT Base Multilingual, but it primarily works when fine-tuned on tasks that use the whole sentence to make decisions, such as sequence classification. Review collected by and hosted on G2.com.

What problems is BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering solving and how is that benefiting you?

There are many benefits of BERT Base Multilingual Uncased, as it helps in predicting the next sentence and in predicting masked words to complete a sentence. Review collected by and hosted on G2.com.

Verified User in Information Technology and Services
Small-Business (50 or fewer emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering?

BERT is a multilingual base model trained on 102 languages. An advantage of the model is that it is uncased. One can easily access it using the PyTorch library. The model is intended to be fine-tuned on tasks that depend on whole sentences. Review collected by and hosted on G2.com.
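"Uncased" here means the tokenizer lowercases text and strips accents before tokenization, so "Héllo" and "hello" map to the same tokens, which is part of what makes one shared vocabulary workable across 102 languages. A simplified stand-in for that normalization step (not the actual BERT tokenizer code):

```python
import unicodedata

def uncased_normalize(text):
    # Lowercase, then strip combining accent marks via NFD decomposition,
    # mirroring the pre-tokenization cleanup that "uncased" BERT
    # tokenizers apply before wordpiece splitting.
    text = text.lower()
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed
                   if unicodedata.category(ch) != "Mn")

print(uncased_normalize("Héllo Wörld"))  # → hello world
```

Because of this normalization, the model cannot distinguish case- or accent-sensitive forms (e.g. "US" vs. "us"), which is the usual trade-off of the uncased variants.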

What do you dislike about BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering?

The model seems to be quite efficient and effective; I didn't find any drawbacks. Review collected by and hosted on G2.com.

What problems is BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering solving and how is that benefiting you?

It helps with tasks such as token classification, sequence classification, and question answering. It can be used to train classifiers, and it is easy to access using pip. Review collected by and hosted on G2.com.

Verified User in Information Technology and Services
Enterprise (> 1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering?

Language Model Tokenizer. It works well with all sets of data and in all generic industries. Review collected by and hosted on G2.com.

What do you dislike about BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering?

Difficult to accomplish tasks in a limited time. Time consuming. Review collected by and hosted on G2.com.

What problems is BERT Base Multilingual Uncased PyTorch Hub Extractive Question Answering solving and how is that benefiting you?

Using the count vectors featurizer and the language model featurizer, we have processed large sets of data and created models with pre-trained embeddings. We have also derived cost optimization from this. Review collected by and hosted on G2.com.