---
language: []
library_name: sentence-transformers
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:300000
  - loss:DenoisingAutoEncoderLoss
base_model: FacebookAI/roberta-base
datasets: []
metrics:
  - pearson_cosine
  - spearman_cosine
  - pearson_manhattan
  - spearman_manhattan
  - pearson_euclidean
  - spearman_euclidean
  - pearson_dot
  - spearman_dot
  - pearson_max
  - spearman_max
widget:
  - source_sentence: >-
      free in spain? Are Spain free Motorways toll-free Spain, renewing old
      concessions coming
    sentences:
      - >-
        how to calculate weighted grade percentage in excel? To find the grade,
        multiply the grade for each assignment against the weight, and then add
        these totals all up. So for each cell (in the Total column) we will
        enter =SUM(Grade Cell * Weight Cell), so my first formula is
        =SUM(B2*C2), the next one would be =SUM(B3*C3) and so on.
      - >-
        In Red Dead Redemption 2's story mode, players have to go to "Story" in
        the menu and then click the save icon from there. However, in Red Dead
        Online, there is no such option. On the contrary, players have no way to
        manually save their game, which is pretty much par for the course in an
        online multiplayer experience.
      - >-
        are motorways free in spain? Are motorways in Spain free? Motorways are
        90% toll-free in Spain. Since 2018, Spain isn't renewing old concessions
        coming to end.
  - source_sentence: things do fort wayne?
    sentences:
      - >-
        what is the difference between a z71 and a 4x4? A Z71 has more
        undercarriage protection (more skid plates) and heavier duty shock
        absorbers/struts for off road use than a 4X4. Other than that the two
        are pretty much the same.
      - is suboxone bad for kidneys?
      - indoor things to do in fort wayne indiana?
  - source_sentence: a should hair?
    sentences:
      - how many times in a week should you shampoo your hair?
      - >-
        Sujith fell into the borewell on Friday around 5:45 pm while playing on
        the family's farm. Initially, he was trapped at a depth of 26 feet but
        slipped to 88 feet during attempts to pull him up by tying ropes around
        his hands. Sujith Wilson fell into a borewell in Tamil Nadu's Trichy on
        Friday.
      - >-
        how to calculate out retained earnings on balance sheet? The retained
        earnings are calculated by adding net income to (or subtracting net
        losses from) the previous term's retained earnings and then subtracting
        any net dividend(s) paid to the shareholders. The figure is calculated
        at the end of each accounting period (quarterly/annually.)
  - source_sentence: long period does go
    sentences:
      - >-
        if someone blocked your email will you know? You could, indeed, be
        blocked It's certainly possible that your recipient has blocked you. All
        that means is that email from your email address is automatically
        discarded at that recipient's end. You will not get a notification;
        there's simply no way to tell that this has happened.
      - is drinking apple cider vinegar every day bad for you?
      - how long after period does weight go down?
  - source_sentence: >-
      beer wine both sugar alcohol excessive be a infections You also sweets,
      along with foods moldy cheese, if you prone.
    sentences:
      - >-
        how long does it take to get xfinity internet? Installation generally
        takes between two to four hours.
      - >-
        They began selling the plush animals to retailers rather than operating
        a store themselves. Today, Boyds is a publicly traded company that
        manufactures 18 million-20 million bears a year, all at a
        government-owned facility in China.
      - >-
        Since beer and wine both contain yeast and sugar (alcohol is sugar
        fermented by yeast), excessive drinking can definitely be a recipe for
        yeast infections. You should also go easy on sweets, along with foods
        like moldy cheese, mushrooms, and anything fermented if you're prone to
        yeast infections. 3.
pipeline_tag: sentence-similarity
model-index:
  - name: SentenceTransformer based on FacebookAI/roberta-base
    results:
      - task:
          type: semantic-similarity
          name: Semantic Similarity
        dataset:
          name: sts test
          type: sts-test
        metrics:
          - type: pearson_cosine
            value: 0.6885553993934473
            name: Pearson Cosine
          - type: spearman_cosine
            value: 0.6912117328249255
            name: Spearman Cosine
          - type: pearson_manhattan
            value: 0.6728262252927975
            name: Pearson Manhattan
          - type: spearman_manhattan
            value: 0.6724759418767672
            name: Spearman Manhattan
          - type: pearson_euclidean
            value: 0.6693578420498989
            name: Pearson Euclidean
          - type: spearman_euclidean
            value: 0.6690698040856067
            name: Spearman Euclidean
          - type: pearson_dot
            value: 0.18975985891617667
            name: Pearson Dot
          - type: spearman_dot
            value: 0.1786146878048478
            name: Spearman Dot
          - type: pearson_max
            value: 0.6885553993934473
            name: Pearson Max
          - type: spearman_max
            value: 0.6912117328249255
            name: Spearman Max
---

SentenceTransformer based on FacebookAI/roberta-base

This is a sentence-transformers model finetuned from FacebookAI/roberta-base. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: FacebookAI/roberta-base
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: RobertaModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
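
In other words, an embedding is the mean over RobertaModel's token representations. As a minimal sketch of what the Transformer + Pooling stack above computes, the same result can be reproduced with the transformers library directly (this assumes the Hub repo also serves the plain transformers weights, as sentence-transformers repos do):

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bobox/RoBERTa-base-unsupervised-TSDAE")
encoder = AutoModel.from_pretrained("bobox/RoBERTa-base-unsupervised-TSDAE")

sentences = ["Are motorways in Spain free?", "indoor things to do in fort wayne indiana?"]
batch = tokenizer(sentences, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state  # (batch, seq_len, 768)

# Mean pooling: average the token embeddings, masking out padding positions.
mask = batch["attention_mask"].unsqueeze(-1).float()        # (batch, seq_len, 1)
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
print(embeddings.shape)  # torch.Size([2, 768])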

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("bobox/RoBERTa-base-unsupervised-TSDAE")
# Run inference
sentences = [
    'beer wine both sugar alcohol excessive be a infections You also sweets, along with foods moldy cheese, if you prone.',
    "Since beer and wine both contain yeast and sugar (alcohol is sugar fermented by yeast), excessive drinking can definitely be a recipe for yeast infections. You should also go easy on sweets, along with foods like moldy cheese, mushrooms, and anything fermented if you're prone to yeast infections. 3.",
    'They began selling the plush animals to retailers rather than operating a store themselves. Today, Boyds is a publicly traded company that manufactures 18 million-20 million bears a year, all at a government-owned facility in China.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
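
The similarity matrix can be used directly for ranking. As a short follow-up sketch, here is semantic search over a small corpus with this model (the query and corpus sentences are taken from the examples above):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("bobox/RoBERTa-base-unsupervised-TSDAE")

query = "are motorways free in spain?"
corpus = [
    "Motorways are 90% toll-free in Spain.",
    "Installation generally takes between two to four hours.",
    "how many times in a week should you shampoo your hair?",
]

query_emb = model.encode([query])
corpus_emb = model.encode(corpus)

# Cosine similarity between the query and every corpus sentence (shape [1, 3]).
scores = model.similarity(query_emb, corpus_emb)
best = scores.argmax().item()
print(corpus[best], float(scores[0, best]))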

Evaluation

Metrics

Semantic Similarity

Metric Value
pearson_cosine 0.6886
spearman_cosine 0.6912
pearson_manhattan 0.6728
spearman_manhattan 0.6725
pearson_euclidean 0.6694
spearman_euclidean 0.6691
pearson_dot 0.1898
spearman_dot 0.1786
pearson_max 0.6886
spearman_max 0.6912
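
Note that the dot-product correlations (≈0.19) sit far below the cosine ones (≈0.69); this is expected for a mean-pooled model whose embeddings are not length-normalized, so cosine similarity (the configured similarity function) is the one to use. These figures come from an EmbeddingSimilarityEvaluator run named sts-test. Below is a hedged sketch of how such numbers can be reproduced, assuming the STS Benchmark test split (the card does not name the evaluation data):

from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("bobox/RoBERTa-base-unsupervised-TSDAE")

# Assumption: "sts test" refers to the STS Benchmark test split.
sts = load_dataset("sentence-transformers/stsb", split="test")
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=sts["sentence1"],
    sentences2=sts["sentence2"],
    scores=sts["score"],
    name="sts-test",
)
print(evaluator(model))  # dict of Pearson/Spearman values per similarity function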

Training Details

Training Dataset

Unnamed Dataset

  • Size: 300,000 training samples
  • Columns: sentence_0 and sentence_1
  • Approximate statistics based on the first 1000 samples:
    • sentence_0: string; min: 3 tokens, mean: 19.88 tokens, max: 54 tokens
    • sentence_1: string; min: 8 tokens, mean: 46.45 tokens, max: 157 tokens
  • Samples:
    • sentence_0: us have across domestic shorthair, a cat pedigreed one between two breeds Unlike domestic shorthairs which come in of looks, Shorthair kittens the distinct
      sentence_1: Most of us have either lived with or come across a domestic shorthair, a cat that closely resembles the pedigreed American Shorthair. The one difference between the two breeds: Unlike domestic shorthairs, which come in a variety of looks, the American Shorthair produces kittens with the same distinct appearance.
    • sentence_0: much cost to get plugs normal with plugs, cost start $120 or if precious plugs are $150 to 200+ . 6 8 will price more required
      sentence_1: how much does it cost to get your spark plugs changed? On a normal 4-cylinder engine with standard spark plugs, replacement cost can start around $120 up to $150+, or if precious metal spark plugs are required, $150 up to $200+. 6 cylinder and 8 Cylinder engines will increase in price, as more spark plugs are required.
    • sentence_0: much my paycheck state income%, your income level not tax rate you is of just that a flat tax rate, those, it has the
      sentence_1: how much taxes are taken out of my paycheck pa? Pennsylvania levies a flat state income tax rate of 3.07%. Therefore, your income level and filing status will not affect the income tax rate you pay at the state level. Pennsylvania is one of just eight states that has a flat income tax rate, and of those states, it has the lowest rate.
  • Loss: DenoisingAutoEncoderLoss
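
DenoisingAutoEncoderLoss implements TSDAE: the encoder embeds a corrupted sentence (sentence_0) and a tied decoder must reconstruct the original (sentence_1) from that single pooled vector. The card does not state how the corruptions were produced; the sketch below assumes the library default, DenoisingAutoEncoderDataset's deletion noise (del_ratio=0.6, which requires nltk with the punkt tokenizer data), and uses hypothetical example sentences:

from datasets import Dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.datasets import DenoisingAutoEncoderDataset
from sentence_transformers.losses import DenoisingAutoEncoderLoss

model = SentenceTransformer("FacebookAI/roberta-base")

# Original sentences; the card does not say which corpus the 300k came from.
sentences = [
    "Motorways are 90% toll-free in Spain.",
    "Installation generally takes between two to four hours.",
]

# DenoisingAutoEncoderDataset.delete removes ~60% of tokens by default.
pairs = {
    "sentence_0": [DenoisingAutoEncoderDataset.delete(s) for s in sentences],
    "sentence_1": sentences,
}
train_dataset = Dataset.from_dict(pairs)

# The loss attaches a decoder (here weight-tied to the encoder) that must
# reconstruct sentence_1 from the pooled embedding of the noisy sentence_0.
loss = DenoisingAutoEncoderLoss(model, tie_encoder_decoder=True)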

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 12
  • per_device_eval_batch_size: 12
  • num_train_epochs: 1
  • multi_dataset_batch_sampler: round_robin
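
Continuing the sketch above, these non-default values map directly onto SentenceTransformerTrainingArguments. The output directory and the evaluator wiring below are assumptions (the card reports sts-test scores at each eval step, which suggests an evaluator was passed), not stated training code:

from sentence_transformers import (
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)

args = SentenceTransformerTrainingArguments(
    output_dir="roberta-base-tsdae",  # hypothetical; not stated in the card
    eval_strategy="steps",
    per_device_train_batch_size=12,
    per_device_eval_batch_size=12,
    num_train_epochs=1,
    multi_dataset_batch_sampler="round_robin",
)

trainer = SentenceTransformerTrainer(
    model=model,                  # the SentenceTransformer from the sketch above
    args=args,
    train_dataset=train_dataset,  # the (sentence_0, sentence_1) pairs from above
    loss=loss,
    evaluator=evaluator,          # e.g. the sts-test evaluator from the Evaluation section
)
trainer.train()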

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 12
  • per_device_eval_batch_size: 12
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch Step Training Loss sts-test_spearman_cosine
0.02 500 7.1409 -
0.04 1000 6.207 -
0.05 1250 - 0.6399
0.06 1500 5.8038 -
0.08 2000 5.4963 -
0.1 2500 5.2609 0.6799
0.12 3000 5.0997 -
0.14 3500 5.0004 -
0.15 3750 - 0.7012
0.16 4000 4.8694 -
0.18 4500 4.7805 -
0.2 5000 4.6776 0.7074
0.22 5500 4.5757 -
0.24 6000 4.4598 -
0.25 6250 - 0.7185
0.26 6500 4.3865 -
0.28 7000 4.2692 -
0.3 7500 4.2224 0.7205
0.32 8000 4.1347 -
0.34 8500 4.0536 -
0.35 8750 - 0.7239
0.36 9000 4.0242 -
0.38 9500 4.0193 -
0.4 10000 3.9166 0.7153
0.42 10500 3.9004 -
0.44 11000 3.8372 -
0.45 11250 - 0.7141
0.46 11500 3.8037 -
0.48 12000 3.7788 -
0.5 12500 3.7191 0.7078
0.52 13000 3.7036 -
0.54 13500 3.6697 -
0.55 13750 - 0.7095
0.56 14000 3.6629 -
0.58 14500 3.639 -
0.6 15000 3.6048 0.7060
0.62 15500 3.6072 -
0.64 16000 3.574 -
0.65 16250 - 0.7056
0.66 16500 3.5423 -
0.68 17000 3.5379 -
0.7 17500 3.5222 0.6969
0.72 18000 3.5076 -
0.74 18500 3.5025 -
0.75 18750 - 0.6959
0.76 19000 3.4943 -
0.78 19500 3.475 -
0.8 20000 3.4874 0.6946
0.82 20500 3.4539 -
0.84 21000 3.4704 -
0.85 21250 - 0.6942
0.86 21500 3.4689 -
0.88 22000 3.4617 -
0.9 22500 3.4471 0.6917
0.92 23000 3.4541 -
0.94 23500 3.4394 -
0.95 23750 - 0.6915
0.96 24000 3.4505 -
0.98 24500 3.4533 -
1.0 25000 3.4574 0.6912

Framework Versions

  • Python: 3.10.13
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.1.2
  • Accelerate: 0.31.0
  • Datasets: 2.19.2
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

DenoisingAutoEncoderLoss

@inproceedings{wang-2021-TSDAE,
    title = "TSDAE: Using Transformer-based Sequential Denoising Auto-Encoderfor Unsupervised Sentence Embedding Learning",
    author = "Wang, Kexin and Reimers, Nils and Gurevych, Iryna", 
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
    month = nov,
    year = "2021",
    address = "Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    pages = "671--688",
    url = "https://arxiv.org/abs/2104.06979",
}