Muennighoff committed
Commit a29a975
1 Parent(s): 1c8c0e7

Update README.md

Files changed (1)
  1. README.md +10 -2
README.md CHANGED
@@ -13,7 +13,15 @@ task_categories:
  ---
  Preprocessed version of Super-Natural-Instructions from https://github.com/allenai/natural-instructions/tree/master/splits. The same inputs may appear with different outputs; to avoid duplicate inputs, you can deduplicate by the `id` or the `inputs` field.
 
- Tasks
  ```
- {'task096_conala_list_index_subtraction', 'task490_mwsc_options_generation', 'task1389_hellaswag_completion', 'task688_mmmlu_answer_generation_college_computer_science', 'task751_svamp_subtraction_question_answering', 'task898_freebase_qa_answer_generation', 'task844_financial_phrasebank_classification', 'task858_inquisitive_span_detection', 'task692_mmmlu_answer_generation_computer_security', 'task318_stereoset_classification_gender', 'task691_mmmlu_answer_generation_college_physics', 'task1403_check_validity_date_mmddyyyy', 'task1566_propara_structured_text_generation', 'task1384_deal_or_no_dialog_classification', 'task122_conala_list_index_addition', 'task718_mmmlu_answer_generation_machine_learning', 'task085_unnatural_addsub_arithmetic', 'task381_boolq_question_generation', 'task378_reverse_words_of_given_length', 'task001_quoref_question_generation', 'task687_mmmlu_answer_generation_college_chemistry', 'task079_conala_concat_strings', 'task1520_qa_srl_answer_generation', 'task1480_gene_extraction_jnlpba_dataset', 'task849_pubmedqa_answer_generation', 'task269_csrg_counterfactual_story_generation', 'task101_reverse_and_concatenate_all_elements_from_index_i_to_j', 'task160_replace_letter_in_a_sentence', 'task637_extract_and_sort_unique_digits_in_a_list', 'task1380_quarel_correct_option_generation', 'task724_mmmlu_answer_generation_moral_scenarios', 'task725_mmmlu_answer_generation_nutrition', 'task1448_disease_entity_extraction_ncbi_dataset', 'task712_mmmlu_answer_generation_high_school_world_history', 'task1609_xquad_en_question_generation', 'task1729_personachat_generate_next', 'task132_dais_text_modification', 'task492_mwsc_incorrect_answer_generation', 'task373_synthetic_round_tens_place', 'task324_jigsaw_classification_disagree', 'task1316_remove_duplicates_string', 'task964_librispeech_asr_text_auto_completion', 'task589_amazonfood_summary_text_generation', 'task627_xlwic_word_with_same_meaning_sentence_generation', 'task1321_country_continent', 'task578_curiosity_dialogs_answer_generation', 'task598_cuad_answer_generation', 'task694_mmmlu_answer_generation_econometrics', 'task1564_triviaqa_answer_generation', 'task1722_civil_comments_threat_classification', 'task1382_quarel_write_correct_answer', 'task509_collate_of_all_alphabetical_and_numerical_elements_in_list_separately', 'task227_clariq_classification', 'task494_review_polarity_answer_generation', 'task154_tomqa_find_location_hard_noise', 'task513_argument_stance_classification', 'task1369_healthfact_sentence_generation', 'task365_synthetic_remove_vowels', 'task090_equation_learner_algebra', 'task761_app_review_classification', 'task076_splash_correcting_sql_mistake', 'task1295_adversarial_qa_question_answering', 'task161_count_words_containing_letter', 'task165_mcscript_question_answering_commonsense', 'task1584_evalution_meronym_classification', 'task1592_yahoo_answers_topics_classfication', 'task690_mmmlu_answer_generation_college_medicine', 'task574_air_dialogue_sentence_generation', 'task1294_wiki_qa_answer_verification', 'task868_mawps_singleop_question_answering', 'task077_splash_explanation_to_sql', 'task089_swap_words_verification', 'task1311_amazonreview_rating_classification', 'task568_circa_question_generation', 'task599_cuad_question_generation', 'task1724_civil_comments_insult_classification', 'task1444_round_power_of_two', 'task398_semeval_2018_task1_tweet_joy_detection', 'task714_mmmlu_answer_generation_human_sexuality', 'task1593_yahoo_answers_topics_classification', 
'task867_mawps_multiop_question_answering', 'task093_conala_normalize_lists', 'task307_jeopardy_answer_generation_final', 'task593_sciq_explanation_generation', 'task1567_propara_question_generation', 'task430_senteval_subject_count', 'task1336_peixian_equity_evaluation_corpus_gender_classifier', 'task374_synthetic_pos_or_neg_calculation', 'task1506_celebrity_minimal_dob_span', 'task1368_healthfact_sentence_generation', 'task851_synthetic_multiply_evens', 'task1704_ljspeech_textmodification', 'task819_pec_sentiment_classification', 'task350_winomt_classification_gender_identifiability_pro', 'task707_mmmlu_answer_generation_high_school_microeconomics', 'task1150_delete_max_min', 'task366_synthetic_return_primes', 'task176_break_decompose_questions', 'task297_storycloze_incorrect_end_classification', 'task283_dream_incorrect_answer_generation', 'task924_event2mind_word_generation', 'task346_hybridqa_classification', 'task369_synthetic_remove_odds', 'task636_extract_and_sort_unique_alphabets_in_a_list', 'task1559_blimp_binary_classification', 'task383_matres_classification', 'task697_mmmlu_answer_generation_formal_logic', 'task608_sbic_sexual_offense_binary_classification', 'task070_abductivenli_incorrect_classification', 'task163_count_words_ending_with_letter', 'task380_boolq_yes_no_question', 'task504_count_all_alphabetical_elements_in_list', 'task156_codah_classification_adversarial', 'task296_storycloze_correct_end_classification', 'task105_story_cloze-rocstories_sentence_generation', 'task319_stereoset_classification_profession', 'task454_swag_incorrect_answer_generation', 'task1194_kth_largest_element', 'task157_count_vowels_and_consonants', 'task1209_atomic_classification_objectuse', 'task142_odd-man-out_classification_no_category', 'task170_hotpotqa_answer_generation', 'task1398_obqa_question_generation', 'task1581_eqasc-perturbed_answer_generation', 'task708_mmmlu_answer_generation_high_school_physics', 'task686_mmmlu_answer_generation_college_biology', 'task908_dialogre_identify_familial_relationships', 'task002_quoref_answer_generation', 'task043_essential_terms_answering_incomplete_questions', 'task1580_eqasc-perturbed_question_generation', 'task066_timetravel_binary_consistency_classification', 'task1603_smcalflow_sentence_generation', 'task1510_evalution_relation_extraction', 'task286_olid_offense_judgment', 'task1678_mathqa_answer_selection', 'task664_mmmlu_answer_generation_abstract_algebra', 'task1645_medical_question_pair_dataset_text_classification', 'task871_msmarco_question_generation', 'task1486_cell_extraction_anem_dataset', 'task1201_atomic_classification_xintent', 'task1607_ethos_text_classification', 'task321_stereoset_classification_religion', 'task848_pubmedqa_classification', 'task168_strategyqa_question_decomposition', 'task684_online_privacy_policy_text_information_type_generation', 'task820_protoqa_answer_generation', 'task326_jigsaw_classification_obscene', 'task592_sciq_incorrect_answer_generation', 'task850_synthetic_longest_palindrome', 'task1338_peixian_equity_evaluation_corpus_sentiment_classifier', 'task162_count_words_starting_with_letter', 'task668_extreme_abstract_summarization', 'task889_goemotions_classification', 'task343_winomt_classification_profession_anti', 'task1314_country_abbreviation', 'task741_lhoestq_answer_generation_place', 'task754_svamp_common-division_question_answering', 'task1479_organization_entity_extraction_btc_corpus', 'task081_piqa_wrong_answer_generation', 'task587_amazonfood_polarity_correction_classification', 
'task888_reviews_classification', 'task1542_every_ith_element_from_starting', 'task512_twitter_emotion_classification', 'task521_trivia_question_classification', 'task632_dbpedia_14_classification', 'task1705_ljspeech_classification', 'task716_mmmlu_answer_generation_jurisprudence', 'task720_mmmlu_answer_generation_marketing', 'task1404_date_conversion', 'task585_preposition_classification', 'task303_record_incorrect_answer_generation', 'task616_cola_classification', 'task400_paws_paraphrase_classification', 'task274_overruling_legal_classification', 'task1289_trec_classification', 'task1383_quarel_write_incorrect_answer', 'task071_abductivenli_answer_generation', 'task1509_evalution_antonyms', 'task146_afs_argument_similarity_gun_control', 'task717_mmmlu_answer_generation_logical_fallacies', 'task902_deceptive_opinion_spam_classification', 'task865_mawps_addsub_question_answering', 'task278_stereoset_sentence_generation_antistereotype', 'task1605_ethos_text_classification', 'task103_facts2story_long_text_generation', 'task631_dbpedia_14_incorrect_answer_generation', 'task1428_country_surface_area', 'task1341_msr_text_classification', 'task639_multi_woz_user_utterance_generation', 'task742_lhoestq_answer_generation_frequency', 'task862_asdiv_multidiv_question_answering', 'task673_google_wellformed_query_classification', 'task1288_glue_mrpc_paraphrasing', 'task1485_organ_extraction_anem_dataset', 'task1206_atomic_classification_isbefore', 'task078_all_elements_except_last_i', 'task737_mmmlu_answer_generation_world_religions', 'task667_mmmlu_answer_generation_business_ethics', 'task756_find_longert_substring_and_return_all_unique_alphabets_in_it', 'task067_abductivenli_answer_generation', 'task1665_trainglecopa_question_generation', 'task1452_location_entity_extraction_btc_corpus', 'task597_cuad_answer_generation', 'task1354_sent_comp_classification', 'task403_creak_commonsense_inference', 'task1656_gooaq_answer_generation', 'task1405_find_median', 'task1443_string_to_number', 'task1421_mathqa_other', 'task766_craigslist_bargains_classification', 'task111_asset_sentence_simplification', 'task294_storycommonsense_motiv_text_generation', 'task1193_food_course_classification', 'task861_asdiv_addsub_question_answering', 'task115_help_advice_classification', 'task507_position_of_all_numerical_elements_in_list', 'task1381_quarel_incorrect_option_generation', 'task917_coqa_question_generation', 'task921_code_x_glue_information_retreival', 'task1366_healthfact_classification', 'task575_air_dialogue_classification', 'task821_protoqa_question_generation', 'task696_mmmlu_answer_generation_elementary_mathematics', 'task141_odd-man-out_classification_category', 'task1333_check_validity_date_ddmmyyyy', 'task582_naturalquestion_answer_generation', 'task955_wiki_auto_style_transfer', 'task229_arc_answer_generation_hard', 'task083_babi_t1_single_supporting_fact_answer_generation', 'task726_mmmlu_answer_generation_philosophy', 'task127_scan_long_text_generation_action_command_all', 'task1519_qa_srl_question_generation', 'task210_logic2text_structured_text_generation', 'task847_pubmedqa_question_generation', 'task211_logic2text_classification', 'task1378_quarel_correct_answer_generation', 'task182_duorc_question_generation', 'task491_mwsc_answer_generation', 'task167_strategyqa_question_generation', 'task706_mmmlu_answer_generation_high_school_mathematics', 'task1283_hrngo_quality_classification', 'task470_mrqa_question_generation', 'task1207_atomic_classification_atlocation', 
'task322_jigsaw_classification_threat', 'task704_mmmlu_answer_generation_high_school_government_and_politics', 'task506_position_of_all_alphabetical_elements_in_list', 'task1148_maximum_ascii_value', 'task1309_amazonreview_summary_classification', 'task1216_atomic_classification_causes', 'task1541_agnews_classification', 'task316_crows-pairs_classification_stereotype', 'task906_dialogre_identify_names', 'task023_cosmosqa_question_generation', 'task628_xlwic_word_with_different_meaning_sentence_generation', 'task857_inquisitive_question_generation', 'task1149_item_check_edible', 'task874_opus_xhosanavy_sr', 'task169_strategyqa_sentence_generation', 'task429_senteval_tense', 'task1483_chemical_extraction_chemprot_dataset', 'task579_socialiqa_classification', 'task583_udeps_eng_coarse_pos_tagging', 'task665_mmmlu_answer_generation_anatomy', 'task025_cosmosqa_incorrect_answer_generation', 'task073_commonsenseqa_answer_generation', 'task1446_farthest_integers', 'task685_mmmlu_answer_generation_clinical_knowledge', 'task1213_atomic_classification_desires', 'task068_abductivenli_incorrect_answer_generation', 'task1560_blimp_binary_classification', 'task633_dbpedia_14_answer_generation', 'task390_torque_text_span_selection', 'task695_mmmlu_answer_generation_electrical_engineering', 'task966_ruletaker_fact_checking_based_on_given_context', 'task1434_head_qa_classification', 'task1504_hatexplain_answer_generation', 'task629_dbpedia_14_classification', 'task489_mwsc_question_generation', 'task675_google_wellformed_query_sentence_generation', 'task355_casino_classification_negotiation_other_need', 'task897_freebase_qa_topic_question_generation', 'task1422_mathqa_physics', 'task767_craigslist_bargains_classification', 'task594_sciq_question_generation', 'task1447_drug_extraction_ade', 'task458_matres_negation_classification', 'task493_review_polarity_classification', 'task755_find_longest_substring_and_replace_its_sorted_lowercase_version_in_both_lists', 'task1727_wiqa_what_is_the_effect', 'task1204_atomic_classification_hinderedby', 'task388_torque_token_classification', 'task347_hybridqa_incorrect_answer_generation', 'task956_leetcode_420_strong_password_check', 'task1202_atomic_classification_xneed', 'task153_tomqa_find_location_hard_clean', 'task610_conllpp_ner', 'task731_mmmlu_answer_generation_professional_psychology', 'task119_semeval_2019_task10_geometric_mathematical_answer_generation', 'task1604_ethos_text_classification', 'task1146_country_capital', 'task600_find_the_longest_common_substring_in_two_strings', 'task293_storycommonsense_emotion_text_generation', 'task605_find_the_longest_common_subsequence_in_two_lists', 'task300_storycloze_order_generation', 'task845_pubmedqa_question_generation', 'task063_first_i_elements', 'task1310_amazonreview_rating_classification', 'task399_semeval_2018_task1_tweet_sadness_detection', 'task311_race_question_generation', 'task194_duorc_answer_generation', 'task626_xlwic_sentence_based_on_given_word_sentence_generation', 'task1364_hans_answer_generation', 'task1326_qa_zre_question_generation_from_answer', 'task1406_kth_smallest_element', 'task900_freebase_qa_category_classification', 'task1151_swap_max_min', 'task107_splash_question_to_sql', 'task703_mmmlu_answer_generation_high_school_geography', 'task356_casino_classification_negotiation_self_need', 'task1203_atomic_classification_xreact', 'task1088_array_of_products', 'task1703_ljspeech_textmodification', 'task333_hateeval_classification_hate_en', 'task459_matres_static_classification', 
'task428_senteval_inversion', 'task1711_poki_text_generation', 'task1731_quartz_question_answering', 'task1518_limit_answer_generation', 'task1205_atomic_classification_isafter', 'task149_afs_argument_quality_death_penalty', 'task875_emotion_classification', 'task1168_brown_coarse_pos_tagging', 'task1401_obqa_sentence_generation', 'task1517_limit_classfication', 'task1602_webquestion_question_genreation', 'task907_dialogre_identify_relationships', 'task497_extract_all_numbers_from_list_in_order', 'task1320_country_domain_tld', 'task097_conala_remove_duplicates', 'task344_hybridqa_answer_generation', 'task1595_event2mind_text_generation_1', 'task1669_md_gender_bias_text_modification', 'task576_curiosity_dialogs_answer_generation', 'task1487_organism_substance_extraction_anem_dataset', 'task478_cls_english_music_classification', 'task072_abductivenli_answer_generation', 'task1197_atomic_classification_oreact', 'task1507_boolean_temporal_reasoning', 'task386_semeval_2018_task3_irony_detection', 'task1661_super_glue_classification', 'task1284_hrngo_informativeness_classification', 'task320_stereoset_classification_race', 'task1319_country_by_barcode_prefix', 'task1332_check_leap_year', 'task377_remove_words_of_given_length', 'task750_aqua_multiple_choice_answering', 'task770_pawsx_english_text_modification', 'task1484_gene_extraction_linnaeus_dataset', 'task689_mmmlu_answer_generation_college_mathematics', 'task138_detoxifying-lms_classification_fluency', 'task191_hotpotqa_question_generation', 'task866_mawps_multidiv_question_answering', 'task143_odd-man-out_classification_generate_category', 'task679_hope_edi_english_text_classification', 'task1089_check_monotonic_array', 'task1568_propara_classification', 'task1573_samsum_classification', 'task1548_wiqa_binary_classification', 'task630_dbpedia_14_classification', 'task1425_country_iso_numeric', 'task061_ropes_answer_generation', 'task1431_head_qa_answer_generation', 'task028_drop_answer_generation', 'task499_extract_and_add_all_numbers_from_list', 'task588_amazonfood_rating_classification', 'task901_freebase_qa_category_question_generation', 'task609_sbic_potentially_offense_binary_classification', 'task1346_glue_cola_grammatical_correctness_classification', 'task886_quail_question_generation', 'task244_count_elements_in_set_union', 'task363_sst2_polarity_classification', 'task457_matres_conditional_classification', 'task177_para-nmt_paraphrasing', 'task732_mmmlu_answer_generation_public_relations', 'task1327_qa_zre_answer_generation_from_question', 'task109_smsspamcollection_spamsmsdetection', 'task065_timetravel_consistent_sentence_classification', 'task205_remove_even_elements', 'task1451_drug_dose_extraction', 'task899_freebase_qa_topic_generation', 'task573_air_dialogue_classification', 'task284_imdb_classification', 'task130_scan_structured_text_generation_command_action_long', 'task698_mmmlu_answer_generation_global_facts', 'task1340_msr_text_compression_compression', 'task1423_mathqa_geometry', 'task193_duorc_question_generation', 'task268_casehold_legal_answer_generation', 'task1720_civil_comments_toxicity_classification', 'task064_all_elements_except_first_i', 'task1361_movierationales_classification', 'task137_detoxifying-lms_classification_toxicity', 'task730_mmmlu_answer_generation_professional_medicine', 'task384_socialiqa_question_classification', 'task371_synthetic_product_of_list', 'task733_mmmlu_answer_generation_security_studies', 'task864_asdiv_singleop_question_answering', 'task1660_super_glue_question_generation', 
'task903_deceptive_opinion_spam_classification', 'task855_conv_ai_2_classification', 'task118_semeval_2019_task10_open_vocabulary_mathematical_answer_generation', 'task1212_atomic_classification_hasproperty', 'task208_combinations_of_list', 'task823_peixian-rtgender_sentiment_analysis', 'task722_mmmlu_answer_generation_random_topic', 'task1192_food_flavor_profile', 'task550_discofuse_sentence_generation', 'task1713_convai3_sentence_generation', 'task1606_ethos_text_classification', 'task728_mmmlu_answer_generation_professional_accounting', 'task328_jigsaw_classification_insult', 'task933_wiki_auto_style_transfer', 'task1198_atomic_classification_owant', 'task745_ai2_arithmetic_questions_arithmetic', 'task853_hippocorpus_long_text_generation', 'task672_amazon_and_yelp_summarization_dataset_summarization', 'task196_sentiment140_answer_generation', 'task734_mmmlu_answer_generation_sociology', 'task276_enhanced_wsc_classification', 'task192_hotpotqa_sentence_generation', 'task1318_country_national_dish', 'task243_count_elements_in_set_intersection', 'task929_products_reviews_classification', 'task471_haspart_answer_generation', 'task1291_multi_news_summarization', 'task1418_bless_semantic_relation_classification', 'task228_arc_answer_generation_easy', 'task1215_atomic_classification_capableof', 'task354_casino_classification_negotiation_no_need', 'task649_race_blank_question_generation', 'task026_drop_question_generation', 'task710_mmmlu_answer_generation_high_school_statistics', 'task125_conala_pair_differences', 'task1200_atomic_classification_xeffect', 'task129_scan_long_text_generation_action_command_short', 'task1600_smcalflow_sentence_generation', 'task887_quail_answer_generation', 'task248_dream_classification', 'task084_babi_t1_single_supporting_fact_identify_relevant_fact', 'task1190_add_integer_to_list', 'task547_alt_translation_entk_en', 'task709_mmmlu_answer_generation_high_school_psychology', 'task1191_food_veg_nonveg', 'task1211_atomic_classification_hassubevent', 'task1482_gene_extraction_chemprot_dataset', 'task740_lhoestq_answer_generation_quantity', 'task1328_qa_zre_relation_generation_from_question', 'task693_mmmlu_answer_generation_conceptual_physics', 'task1317_country_calling_code', 'task1725_civil_comments_severtoxicity_classification', 'task1730_personachat_choose_next', 'task607_sbic_intentional_offense_binary_classification', 'task372_synthetic_palindrome_numbers', 'task340_winomt_classification_gender_pro', 'task1217_atomic_answer_generation', 'task091_all_elements_from_index_i_to_j', 'task094_conala_calculate_mean', 'task453_swag_answer_generation', 'task461_qasper_question_generation', 'task306_jeopardy_answer_generation_double', 'task1599_smcalflow_classification', 'task860_prost_mcq_generation', 'task1290_xsum_summarization', 'task1420_mathqa_general', 'task325_jigsaw_classification_identity_attack', 'task341_winomt_classification_gender_anti', 'task024_cosmosqa_answer_generation', 'task1594_yahoo_answers_topics_question_generation', 'task681_hope_edi_malayalam_text_classification', 'task515_senteval_odd_word_out', 'task128_scan_structured_text_generation_command_action_short', 'task590_amazonfood_summary_correction_classification', 'task1712_poki_classification', 'task270_csrg_counterfactual_context_generation', 'task1429_evalution_semantic_relation_classification', 'task1585_root09_hypernym_generation', 'task1488_sarcasmdetection_headline_classification', 'task027_drop_answer_type_generation', 'task275_enhanced_wsc_paraphrase_generation', 
'task075_squad1.1_answer_generation', 'task563_discofuse_answer_generation', 'task1196_atomic_classification_oeffect', 'task069_abductivenli_classification', 'task022_cosmosqa_passage_inappropriate_binary', 'task617_amazonreview_category_text_generation', 'task295_semeval_2020_task4_commonsense_reasoning', 'task285_imdb_answer_generation', 'task1501_dstc3_answer_generation', 'task104_semeval_2019_task10_closed_vocabulary_mathematical_answer_generation', 'task099_reverse_elements_between_index_i_and_j', 'task351_winomt_classification_gender_identifiability_anti', 'task302_record_classification', 'task1553_cnn_dailymail_summarization', 'task1596_event2mind_text_generation_2', 'task1499_dstc3_summarization', 'task299_storycloze_sentence_generation', 'task375_classify_type_of_sentence_in_debate', 'task431_senteval_object_count', 'task280_stereoset_classification_stereotype_type', 'task918_coqa_answer_generation', 'task287_casehold_legal_incorrect_answer_generation', 'task044_essential_terms_identifying_essential_words', 'task382_hybridqa_answer_generation', 'task488_extract_all_alphabetical_elements_from_list_in_order', 'task1331_reverse_array', 'task1565_triviaqa_classification', 'task183_rhyme_generation', 'task904_hate_speech_offensive_classification', 'task1285_kpa_keypoint_matching', 'task158_count_frequency_of_words', 'task207_max_element_lists', 'task564_discofuse_classification', 'task715_mmmlu_answer_generation_international_law', 'task279_stereoset_classification_stereotype', 'task339_record_answer_generation', 'task092_check_prime_classification', 'task047_miscellaneous_answering_science_questions', 'task727_mmmlu_answer_generation_prehistory', 'task1726_mathqa_correct_answer_generation', 'task062_bigbench_repeat_copy_logic', 'task150_afs_argument_quality_gun_control', 'task926_coached_conv_pref_word_generation', 'task705_mmmlu_answer_generation_high_school_macroeconomics', 'task1495_adverse_drug_event_classification', 'task308_jeopardy_answer_generation_all', 'task1189_check_char_in_string', 'task746_yelp_restaurant_review_classification', 'task060_ropes_question_generation', 'task1400_obqa_incorrect_answer_generation', 'task1188_count_max_freq_char', 'task397_semeval_2018_task1_tweet_anger_detection', 'task309_race_answer_generation', 'task1313_amazonreview_polarity_classification', 'task1502_hatexplain_classification', 'task846_pubmedqa_classification', 'task835_mathdataset_answer_generation', 'task1399_obqa_answer_generation', 'task405_narrativeqa_question_generation', 'task965_librispeech_asr_missing_word_prediction', 'task195_sentiment140_classification', 'task861_prost_mcq_answers_generation', 'task1087_two_number_sum', 'task927_yelp_negative_to_positive_style_transfer', 'task114_is_the_given_word_longest', 'task580_socialiqa_answer_generation', 'task611_mutual_multi_turn_dialogue', 'task476_cls_english_books_classification', 'task305_jeopardy_answer_generation_normal', 'task206_collatz_conjecture', 'task098_conala_list_intersection', 'task1167_penn_treebank_coarse_pos_tagging', 'task144_subjqa_question_answering', 'task095_conala_max_absolute_value', 'task469_mrqa_answer_generation', 'task1426_country_independence_year', 'task752_svamp_multiplication_question_answering', 'task1339_peixian_equity_evaluation_corpus_text_completion', 'task566_circa_classification', 'task475_yelp_polarity_classification', 'task638_multi_woz_classification', 'task108_contextualabusedetection_classification', 'task389_torque_generate_temporal_question', 'task1427_country_region_in_world', 
'task618_amazonreview_summary_text_generation', 'task327_jigsaw_classification_toxic', 'task139_detoxifying-lms_classification_topicality', 'task1714_convai3_sentence_generation', 'task074_squad1.1_question_generation', 'task088_identify_typo_verification', 'task596_mocha_question_generation', 'task713_mmmlu_answer_generation_human_aging', 'task1590_diplomacy_text_generation', 'task565_circa_answer_generation', 'task1208_atomic_classification_xreason', 'task1419_mathqa_gain', 'task359_casino_classification_negotiation_vouch_fair', 'task148_afs_argument_quality_gay_marriage', 'task1292_yelp_review_full_text_categorization', 'task729_mmmlu_answer_generation_professional_law', 'task181_outcome_extraction', 'task246_dream_question_generation', 'task342_winomt_classification_profession_pro', 'task674_google_wellformed_query_sentence_generation', 'task736_mmmlu_answer_generation_virology', 'task110_logic2text_sentence_generation', 'task1583_bless_meronym_classification', 'task335_hateeval_classification_aggresive_en', 'task368_synthetic_even_or_odd_calculation', 'task460_qasper_answer_generation', 'task682_online_privacy_policy_text_classification', 'task298_storycloze_correct_end_classification', 'task523_find_if_numbers_or_alphabets_are_more_in_list', 'task385_socialiqa_incorrect_answer_generation', 'task517_emo_classify_emotion_of_dialogue', 'task455_swag_context_generation', 'task087_new_operator_addsub_arithmetic', 'task1551_every_ith_element_from_kth_element', 'task1210_atomic_classification_madeupof', 'task387_semeval_2018_task3_irony_classification', 'task1296_wiki_hop_question_answering', 'task159_check_frequency_of_words_in_sentence_pair', 'task358_casino_classification_negotiation_uv_part', 'task567_circa_text_generation', 'task581_socialiqa_question_generation', 'task178_quartz_question_answering', 'task1549_wiqa_answer_generation_missing_step', 'task413_mickey_en_sentence_perturbation_generation', 'task367_synthetic_remove_floats', 'task615_moviesqa_answer_generation', 'task112_asset_simple_sentence_identification', 'task370_synthetic_remove_divisible_by_3', 'task100_concatenate_all_elements_from_index_i_to_j', 'task124_conala_pair_averages', 'task1503_hatexplain_classification', 'task1214_atomic_classification_xwant', 'task606_sum_of_all_numbers_in_list_between_positions_i_and_j', 'task1199_atomic_classification_xattr', 'task1572_samsum_summary', 'task1360_numer_sense_multiple_choice_qa_generation', 'task166_clariq_sentence_generation', 'task1453_person_entity_extraction_btc_corpus', 'task1347_glue_sts-b_similarity_classification', 'task376_reverse_order_of_words', 'task345_hybridqa_answer_generation', 'task753_svamp_addition_question_answering', 'task209_stancedetection_classification', 'task353_casino_classification_negotiation_elicit_pref', 'task711_mmmlu_answer_generation_high_school_us_history', 'task683_online_privacy_policy_text_purpose_answer_generation', 'task245_check_presence_in_set_intersection', 'task721_mmmlu_answer_generation_medical_genetics', 'task1379_quarel_incorrect_answer_generation', 'task1308_amazonreview_category_classification', 'task1657_gooaq_question_generation', 'task1582_bless_hypernym_generation', 'task622_replace_alphabets_in_a_list_by_their_position_in_english_alphabet', 'task046_miscellaneous_question_typing', 'task151_tomqa_find_location_easy_clean', 'task247_dream_answer_generation', 'task059_ropes_story_generation', 'task586_amazonfood_polarity_classification', 'task843_financial_phrasebank_classification', 
'task277_stereoset_sentence_generation_stereotype', 'task291_semeval_2020_task4_commonsense_validation', 'task131_scan_long_text_generation_action_command_long', 'task477_cls_english_dvd_classification', 'task863_asdiv_multiop_question_answering', 'task518_emo_different_dialogue_emotions', 'task1315_find_range_array', 'task472_haspart_classification', 'task584_udeps_eng_fine_pos_tagging', 'task337_hateeval_classification_individual_en', 'task922_event2mind_word_generation', 'task1500_dstc3_classification', 'task1489_sarcasmdetection_tweet_classification', 'task856_conv_ai_2_classification', 'task113_count_frequency_of_letter', 'task700_mmmlu_answer_generation_high_school_chemistry', 'task869_cfq_mcd1_sql_to_explanation', 'task147_afs_argument_similarity_gay_marriage', 'task180_intervention_extraction', 'task923_event2mind_classifier', 'task719_mmmlu_answer_generation_management', 'task1355_sent_comp_summarization', 'task905_hate_speech_offensive_classification', 'task462_qasper_classification', 'task919_coqa_incorrect_answer_generation', 'task1670_md_gender_bias_text_modification', 'task496_semeval_answer_generation', 'task164_mcscript_question_answering_text', 'task963_librispeech_asr_next_word_prediction', 'task1147_country_currency', 'task1293_kilt_tasks_hotpotqa_question_answering', 'task560_alt_translation_en_entk', 'task123_conala_sort_dictionary', 'task591_sciq_answer_generation', 'task854_hippocorpus_classification', 'task625_xlwic_true_or_false_answer_generation', 'task267_concatenate_and_reverse_all_elements_from_index_i_to_j', 'task1608_xquad_en_answer_generation', 'task1445_closest_integers', 'task870_msmarco_answer_generation', 'task184_break_generate_question', 'task045_miscellaneous_sentence_paraphrasing', 'task292_storycommonsense_character_text_generation', 'task080_piqa_answer_generation', 'task1286_openbookqa_question_answering', 'task1312_amazonreview_polarity_classification', 'task1721_civil_comments_obscenity_classification', 'task179_participant_extraction', 'task522_news_editorial_summary', 'task859_prost_question_generation', 'task967_ruletaker_incorrect_fact_generation_based_on_given_paragraph', 'task672_nummersense', 'task1449_disease_entity_extraction_bc5cdr_dataset', 'task1601_webquestions_answer_generation', 'task868_cfq_mcd1_explanation_to_sql', 'task1424_mathqa_probability', 'task310_race_classification', 'task364_regard_social_impact_classification', 'task145_afs_argument_similarity_death_penalty', 'task1135_xcsr_en_commonsense_mc_classification', 'task357_casino_classification_negotiation_small_talk', 'task505_count_all_numerical_elements_in_list', 'task928_yelp_positive_to_negative_style_transfer', 'task155_count_nouns_verbs', 'task140_detoxifying-lms_classification_style', 'task516_senteval_conjoints_inversion', 'task1498_24hour_to_12hour_clock', 'task1505_root09_semantic_relation_classification', 'task152_tomqa_find_location_easy_noise', 'task317_crows-pairs_classification_stereotype_type', 'task1706_ljspeech_classification', 'task212_logic2text_classification', 'task456_matres_intention_classification', 'task666_mmmlu_answer_generation_astronomy', 'task116_com2sense_commonsense_reasoning', 'task701_mmmlu_answer_generation_high_school_computer_science', 'task1322_country_government_type', 'task834_mathdataset_classification', 'task702_mmmlu_answer_generation_high_school_european_history', 'task1481_gene_extraction_bc2gm_dataset', 'task495_semeval_headline_classification', 'task1412_web_questions_question_answering', 
'task833_poem_sentiment_classification', 'task379_agnews_topic_classification', 'task1186_nne_hrngo_classification', 'task126_scan_structured_text_generation_command_action_all', 'task852_synthetic_multiply_odds', 'task301_record_question_generation', 'task082_babi_t1_single_supporting_fact_question_generation', 'task1508_wordnet_antonyms', 'task514_argument_consequence_classification', 'task1325_qa_zre_question_generation_on_subject_relation', 'task925_coached_conv_pref_classifier', 'task1359_numer_sense_answer_generation', 'task595_mocha_answer_generation', 'task735_mmmlu_answer_generation_us_foreign_policy', 'task909_dialogre_prevalent_speakers', 'task1723_civil_comments_sexuallyexplicit_classification', 'task223_quartz_explanation_generation', 'task723_mmmlu_answer_generation_moral_disputes', 'task577_curiosity_dialogs_classification', 'task739_lhoestq_question_generation', 'task934_turk_simplification', 'task699_mmmlu_answer_generation_high_school_biology', 'task323_jigsaw_classification_sexually_explicit'}
  ```
 
  ---
  Preprocessed version of Super-Natural-Instructions from https://github.com/allenai/natural-instructions/tree/master/splits. The same inputs may appear with different outputs; to avoid duplicate inputs, you can deduplicate by the `id` or the `inputs` field.
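
For example, here is a minimal deduplication sketch using the `datasets` library. The repository ID and split name below are placeholders; point them at this dataset's actual location on the Hub.

```
from datasets import load_dataset

# Placeholder repository ID and split; adjust to this dataset's actual Hub location.
ds = load_dataset("Muennighoff/natural-instructions", split="train")

seen = set()

def first_occurrence(example):
    # Keep a row only the first time its `inputs` value (or `id`) appears.
    key = example["inputs"]
    if key in seen:
        return False
    seen.add(key)
    return True

deduped = ds.filter(first_occurrence)  # single-process by default, so `seen` is shared
```

This sketch relies on the default single-process `filter`; with `num_proc > 1` each worker keeps its own `seen` set, so duplicates could slip through.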
 
+ Train Tasks:
  ```
+ ['task001_quoref_question_generation', 'task002_quoref_answer_generation', 'task022_cosmosqa_passage_inappropriate_binary', 'task023_cosmosqa_question_generation', 'task024_cosmosqa_answer_generation', 'task025_cosmosqa_incorrect_answer_generation', 'task026_drop_question_generation', 'task027_drop_answer_type_generation', 'task028_drop_answer_generation', 'task043_essential_terms_answering_incomplete_questions', 'task044_essential_terms_identifying_essential_words', 'task045_miscellaneous_sentence_paraphrasing', 'task046_miscellaneous_question_typing', 'task047_miscellaneous_answering_science_questions', 'task059_ropes_story_generation', 'task060_ropes_question_generation', 'task061_ropes_answer_generation', 'task062_bigbench_repeat_copy_logic', 'task063_first_i_elements', 'task064_all_elements_except_first_i', 'task065_timetravel_consistent_sentence_classification', 'task066_timetravel_binary_consistency_classification', 'task067_abductivenli_answer_generation', 'task068_abductivenli_incorrect_answer_generation', 'task069_abductivenli_classification', 'task070_abductivenli_incorrect_classification', 'task071_abductivenli_answer_generation', 'task072_abductivenli_answer_generation', 'task073_commonsenseqa_answer_generation', 'task074_squad1.1_question_generation', 'task075_squad1.1_answer_generation', 'task076_splash_correcting_sql_mistake', 'task077_splash_explanation_to_sql', 'task078_all_elements_except_last_i', 'task079_conala_concat_strings', 'task080_piqa_answer_generation', 'task081_piqa_wrong_answer_generation', 'task082_babi_t1_single_supporting_fact_question_generation', 'task083_babi_t1_single_supporting_fact_answer_generation', 'task084_babi_t1_single_supporting_fact_identify_relevant_fact', 'task085_unnatural_addsub_arithmetic', 'task087_new_operator_addsub_arithmetic', 'task088_identify_typo_verification', 'task089_swap_words_verification', 'task090_equation_learner_algebra', 'task091_all_elements_from_index_i_to_j', 'task092_check_prime_classification', 'task093_conala_normalize_lists', 'task094_conala_calculate_mean', 'task095_conala_max_absolute_value', 'task096_conala_list_index_subtraction', 'task097_conala_remove_duplicates', 'task098_conala_list_intersection', 'task099_reverse_elements_between_index_i_and_j', 'task100_concatenate_all_elements_from_index_i_to_j', 'task101_reverse_and_concatenate_all_elements_from_index_i_to_j', 'task103_facts2story_long_text_generation', 'task104_semeval_2019_task10_closed_vocabulary_mathematical_answer_generation', 'task105_story_cloze-rocstories_sentence_generation', 'task107_splash_question_to_sql', 'task1087_two_number_sum', 'task1088_array_of_products', 'task1089_check_monotonic_array', 'task108_contextualabusedetection_classification', 'task109_smsspamcollection_spamsmsdetection', 'task110_logic2text_sentence_generation', 'task111_asset_sentence_simplification', 'task112_asset_simple_sentence_identification', 'task1135_xcsr_en_commonsense_mc_classification', 'task113_count_frequency_of_letter', 'task1146_country_capital', 'task1147_country_currency', 'task1148_maximum_ascii_value', 'task1149_item_check_edible', 'task114_is_the_given_word_longest', 'task1150_delete_max_min', 'task1151_swap_max_min', 'task115_help_advice_classification', 'task1167_penn_treebank_coarse_pos_tagging', 'task1168_brown_coarse_pos_tagging', 'task116_com2sense_commonsense_reasoning', 'task1186_nne_hrngo_classification', 'task1188_count_max_freq_char', 'task1189_check_char_in_string', 
'task118_semeval_2019_task10_open_vocabulary_mathematical_answer_generation', 'task1190_add_integer_to_list', 'task1191_food_veg_nonveg', 'task1192_food_flavor_profile', 'task1193_food_course_classification', 'task1194_kth_largest_element', 'task1196_atomic_classification_oeffect', 'task1197_atomic_classification_oreact', 'task1198_atomic_classification_owant', 'task1199_atomic_classification_xattr', 'task119_semeval_2019_task10_geometric_mathematical_answer_generation', 'task1200_atomic_classification_xeffect', 'task1201_atomic_classification_xintent', 'task1202_atomic_classification_xneed', 'task1203_atomic_classification_xreact', 'task1204_atomic_classification_hinderedby', 'task1205_atomic_classification_isafter', 'task1206_atomic_classification_isbefore', 'task1207_atomic_classification_atlocation', 'task1208_atomic_classification_xreason', 'task1209_atomic_classification_objectuse', 'task1210_atomic_classification_madeupof', 'task1211_atomic_classification_hassubevent', 'task1212_atomic_classification_hasproperty', 'task1213_atomic_classification_desires', 'task1214_atomic_classification_xwant', 'task1215_atomic_classification_capableof', 'task1216_atomic_classification_causes', 'task1217_atomic_answer_generation', 'task122_conala_list_index_addition', 'task123_conala_sort_dictionary', 'task124_conala_pair_averages', 'task125_conala_pair_differences', 'task126_scan_structured_text_generation_command_action_all', 'task127_scan_long_text_generation_action_command_all', 'task1283_hrngo_quality_classification', 'task1284_hrngo_informativeness_classification', 'task1285_kpa_keypoint_matching', 'task1286_openbookqa_question_answering', 'task1288_glue_mrpc_paraphrasing', 'task1289_trec_classification', 'task128_scan_structured_text_generation_command_action_short', 'task1290_xsum_summarization', 'task1291_multi_news_summarization', 'task1292_yelp_review_full_text_categorization', 'task1293_kilt_tasks_hotpotqa_question_answering', 'task1294_wiki_qa_answer_verification', 'task1295_adversarial_qa_question_answering', 'task1296_wiki_hop_question_answering', 'task129_scan_long_text_generation_action_command_short', 'task1308_amazonreview_category_classification', 'task1309_amazonreview_summary_classification', 'task130_scan_structured_text_generation_command_action_long', 'task1310_amazonreview_rating_classification', 'task1311_amazonreview_rating_classification', 'task1312_amazonreview_polarity_classification', 'task1313_amazonreview_polarity_classification', 'task1314_country_abbreviation', 'task1315_find_range_array', 'task1316_remove_duplicates_string', 'task1317_country_calling_code', 'task1318_country_national_dish', 'task1319_country_by_barcode_prefix', 'task131_scan_long_text_generation_action_command_long', 'task1320_country_domain_tld', 'task1321_country_continent', 'task1322_country_government_type', 'task1325_qa_zre_question_generation_on_subject_relation', 'task1326_qa_zre_question_generation_from_answer', 'task1327_qa_zre_answer_generation_from_question', 'task1328_qa_zre_relation_generation_from_question', 'task132_dais_text_modification', 'task1331_reverse_array', 'task1332_check_leap_year', 'task1333_check_validity_date_ddmmyyyy', 'task1336_peixian_equity_evaluation_corpus_gender_classifier', 'task1338_peixian_equity_evaluation_corpus_sentiment_classifier', 'task1339_peixian_equity_evaluation_corpus_text_completion', 'task1340_msr_text_compression_compression', 'task1341_msr_text_classification', 'task1346_glue_cola_grammatical_correctness_classification', 
'task1347_glue_sts-b_similarity_classification', 'task1354_sent_comp_classification', 'task1355_sent_comp_summarization', 'task1359_numer_sense_answer_generation', 'task1360_numer_sense_multiple_choice_qa_generation', 'task1361_movierationales_classification', 'task1364_hans_answer_generation', 'task1366_healthfact_classification', 'task1368_healthfact_sentence_generation', 'task1369_healthfact_sentence_generation', 'task1378_quarel_correct_answer_generation', 'task1379_quarel_incorrect_answer_generation', 'task137_detoxifying-lms_classification_toxicity', 'task1380_quarel_correct_option_generation', 'task1381_quarel_incorrect_option_generation', 'task1382_quarel_write_correct_answer', 'task1383_quarel_write_incorrect_answer', 'task1384_deal_or_no_dialog_classification', 'task1389_hellaswag_completion', 'task138_detoxifying-lms_classification_fluency', 'task1398_obqa_question_generation', 'task1399_obqa_answer_generation', 'task139_detoxifying-lms_classification_topicality', 'task1400_obqa_incorrect_answer_generation', 'task1401_obqa_sentence_generation', 'task1403_check_validity_date_mmddyyyy', 'task1404_date_conversion', 'task1405_find_median', 'task1406_kth_smallest_element', 'task140_detoxifying-lms_classification_style', 'task1412_web_questions_question_answering', 'task1418_bless_semantic_relation_classification', 'task1419_mathqa_gain', 'task141_odd-man-out_classification_category', 'task1420_mathqa_general', 'task1421_mathqa_other', 'task1422_mathqa_physics', 'task1423_mathqa_geometry', 'task1424_mathqa_probability', 'task1425_country_iso_numeric', 'task1426_country_independence_year', 'task1427_country_region_in_world', 'task1428_country_surface_area', 'task1429_evalution_semantic_relation_classification', 'task142_odd-man-out_classification_no_category', 'task1431_head_qa_answer_generation', 'task1434_head_qa_classification', 'task143_odd-man-out_classification_generate_category', 'task1443_string_to_number', 'task1444_round_power_of_two', 'task1445_closest_integers', 'task1446_farthest_integers', 'task1447_drug_extraction_ade', 'task1448_disease_entity_extraction_ncbi_dataset', 'task1449_disease_entity_extraction_bc5cdr_dataset', 'task144_subjqa_question_answering', 'task1451_drug_dose_extraction', 'task1452_location_entity_extraction_btc_corpus', 'task1453_person_entity_extraction_btc_corpus', 'task145_afs_argument_similarity_death_penalty', 'task146_afs_argument_similarity_gun_control', 'task1479_organization_entity_extraction_btc_corpus', 'task147_afs_argument_similarity_gay_marriage', 'task1480_gene_extraction_jnlpba_dataset', 'task1481_gene_extraction_bc2gm_dataset', 'task1482_gene_extraction_chemprot_dataset', 'task1483_chemical_extraction_chemprot_dataset', 'task1484_gene_extraction_linnaeus_dataset', 'task1485_organ_extraction_anem_dataset', 'task1486_cell_extraction_anem_dataset', 'task1487_organism_substance_extraction_anem_dataset', 'task1488_sarcasmdetection_headline_classification', 'task1489_sarcasmdetection_tweet_classification', 'task148_afs_argument_quality_gay_marriage', 'task1495_adverse_drug_event_classification', 'task1498_24hour_to_12hour_clock', 'task1499_dstc3_summarization', 'task149_afs_argument_quality_death_penalty', 'task1500_dstc3_classification', 'task1501_dstc3_answer_generation', 'task1502_hatexplain_classification', 'task1503_hatexplain_classification', 'task1504_hatexplain_answer_generation', 'task1505_root09_semantic_relation_classification', 'task1506_celebrity_minimal_dob_span', 'task1507_boolean_temporal_reasoning', 
'task1508_wordnet_antonyms', 'task1509_evalution_antonyms', 'task150_afs_argument_quality_gun_control', 'task1510_evalution_relation_extraction', 'task1517_limit_classfication', 'task1518_limit_answer_generation', 'task1519_qa_srl_question_generation', 'task151_tomqa_find_location_easy_clean', 'task1520_qa_srl_answer_generation', 'task152_tomqa_find_location_easy_noise', 'task153_tomqa_find_location_hard_clean', 'task1541_agnews_classification', 'task1542_every_ith_element_from_starting', 'task1548_wiqa_binary_classification', 'task1549_wiqa_answer_generation_missing_step', 'task154_tomqa_find_location_hard_noise', 'task1551_every_ith_element_from_kth_element', 'task1553_cnn_dailymail_summarization', 'task1559_blimp_binary_classification', 'task155_count_nouns_verbs', 'task1560_blimp_binary_classification', 'task1564_triviaqa_answer_generation', 'task1565_triviaqa_classification', 'task1566_propara_structured_text_generation', 'task1567_propara_question_generation', 'task1568_propara_classification', 'task156_codah_classification_adversarial', 'task1572_samsum_summary', 'task1573_samsum_classification', 'task157_count_vowels_and_consonants', 'task1580_eqasc-perturbed_question_generation', 'task1581_eqasc-perturbed_answer_generation', 'task1582_bless_hypernym_generation', 'task1583_bless_meronym_classification', 'task1584_evalution_meronym_classification', 'task1585_root09_hypernym_generation', 'task158_count_frequency_of_words', 'task1590_diplomacy_text_generation', 'task1592_yahoo_answers_topics_classfication', 'task1593_yahoo_answers_topics_classification', 'task1594_yahoo_answers_topics_question_generation', 'task1595_event2mind_text_generation_1', 'task1596_event2mind_text_generation_2', 'task1599_smcalflow_classification', 'task159_check_frequency_of_words_in_sentence_pair', 'task1600_smcalflow_sentence_generation', 'task1601_webquestions_answer_generation', 'task1602_webquestion_question_genreation', 'task1603_smcalflow_sentence_generation', 'task1604_ethos_text_classification', 'task1605_ethos_text_classification', 'task1606_ethos_text_classification', 'task1607_ethos_text_classification', 'task1608_xquad_en_answer_generation', 'task1609_xquad_en_question_generation', 'task160_replace_letter_in_a_sentence', 'task161_count_words_containing_letter', 'task162_count_words_starting_with_letter', 'task163_count_words_ending_with_letter', 'task1645_medical_question_pair_dataset_text_classification', 'task164_mcscript_question_answering_text', 'task1656_gooaq_answer_generation', 'task1657_gooaq_question_generation', 'task165_mcscript_question_answering_commonsense', 'task1660_super_glue_question_generation', 'task1661_super_glue_classification', 'task1665_trainglecopa_question_generation', 'task1669_md_gender_bias_text_modification', 'task166_clariq_sentence_generation', 'task1670_md_gender_bias_text_modification', 'task1678_mathqa_answer_selection', 'task167_strategyqa_question_generation', 'task168_strategyqa_question_decomposition', 'task169_strategyqa_sentence_generation', 'task1703_ljspeech_textmodification', 'task1704_ljspeech_textmodification', 'task1705_ljspeech_classification', 'task1706_ljspeech_classification', 'task170_hotpotqa_answer_generation', 'task1711_poki_text_generation', 'task1712_poki_classification', 'task1713_convai3_sentence_generation', 'task1714_convai3_sentence_generation', 'task1720_civil_comments_toxicity_classification', 'task1721_civil_comments_obscenity_classification', 'task1722_civil_comments_threat_classification', 
'task1723_civil_comments_sexuallyexplicit_classification', 'task1724_civil_comments_insult_classification', 'task1725_civil_comments_severtoxicity_classification', 'task1726_mathqa_correct_answer_generation', 'task1727_wiqa_what_is_the_effect', 'task1729_personachat_generate_next', 'task1730_personachat_choose_next', 'task1731_quartz_question_answering', 'task176_break_decompose_questions', 'task177_para-nmt_paraphrasing', 'task178_quartz_question_answering', 'task179_participant_extraction', 'task180_intervention_extraction', 'task181_outcome_extraction', 'task182_duorc_question_generation', 'task183_rhyme_generation', 'task184_break_generate_question', 'task191_hotpotqa_question_generation', 'task192_hotpotqa_sentence_generation', 'task193_duorc_question_generation', 'task194_duorc_answer_generation', 'task195_sentiment140_classification', 'task196_sentiment140_answer_generation', 'task205_remove_even_elements', 'task206_collatz_conjecture', 'task207_max_element_lists', 'task208_combinations_of_list', 'task209_stancedetection_classification', 'task210_logic2text_structured_text_generation', 'task211_logic2text_classification', 'task212_logic2text_classification', 'task223_quartz_explanation_generation', 'task227_clariq_classification', 'task228_arc_answer_generation_easy', 'task229_arc_answer_generation_hard', 'task243_count_elements_in_set_intersection', 'task244_count_elements_in_set_union', 'task245_check_presence_in_set_intersection', 'task246_dream_question_generation', 'task247_dream_answer_generation', 'task248_dream_classification', 'task267_concatenate_and_reverse_all_elements_from_index_i_to_j', 'task268_casehold_legal_answer_generation', 'task269_csrg_counterfactual_story_generation', 'task270_csrg_counterfactual_context_generation', 'task274_overruling_legal_classification', 'task275_enhanced_wsc_paraphrase_generation', 'task276_enhanced_wsc_classification', 'task277_stereoset_sentence_generation_stereotype', 'task278_stereoset_sentence_generation_antistereotype', 'task279_stereoset_classification_stereotype', 'task280_stereoset_classification_stereotype_type', 'task283_dream_incorrect_answer_generation', 'task284_imdb_classification', 'task285_imdb_answer_generation', 'task286_olid_offense_judgment', 'task287_casehold_legal_incorrect_answer_generation', 'task291_semeval_2020_task4_commonsense_validation', 'task292_storycommonsense_character_text_generation', 'task293_storycommonsense_emotion_text_generation', 'task294_storycommonsense_motiv_text_generation', 'task295_semeval_2020_task4_commonsense_reasoning', 'task296_storycloze_correct_end_classification', 'task297_storycloze_incorrect_end_classification', 'task298_storycloze_correct_end_classification', 'task299_storycloze_sentence_generation', 'task300_storycloze_order_generation', 'task301_record_question_generation', 'task302_record_classification', 'task303_record_incorrect_answer_generation', 'task305_jeopardy_answer_generation_normal', 'task306_jeopardy_answer_generation_double', 'task307_jeopardy_answer_generation_final', 'task308_jeopardy_answer_generation_all', 'task309_race_answer_generation', 'task310_race_classification', 'task311_race_question_generation', 'task316_crows-pairs_classification_stereotype', 'task317_crows-pairs_classification_stereotype_type', 'task318_stereoset_classification_gender', 'task319_stereoset_classification_profession', 'task320_stereoset_classification_race', 'task321_stereoset_classification_religion', 'task322_jigsaw_classification_threat', 
'task323_jigsaw_classification_sexually_explicit', 'task324_jigsaw_classification_disagree', 'task325_jigsaw_classification_identity_attack', 'task326_jigsaw_classification_obscene', 'task327_jigsaw_classification_toxic', 'task328_jigsaw_classification_insult', 'task333_hateeval_classification_hate_en', 'task335_hateeval_classification_aggresive_en', 'task337_hateeval_classification_individual_en', 'task339_record_answer_generation', 'task340_winomt_classification_gender_pro', 'task341_winomt_classification_gender_anti', 'task342_winomt_classification_profession_pro', 'task343_winomt_classification_profession_anti', 'task344_hybridqa_answer_generation', 'task345_hybridqa_answer_generation', 'task346_hybridqa_classification', 'task347_hybridqa_incorrect_answer_generation', 'task350_winomt_classification_gender_identifiability_pro', 'task351_winomt_classification_gender_identifiability_anti', 'task353_casino_classification_negotiation_elicit_pref', 'task354_casino_classification_negotiation_no_need', 'task355_casino_classification_negotiation_other_need', 'task356_casino_classification_negotiation_self_need', 'task357_casino_classification_negotiation_small_talk', 'task358_casino_classification_negotiation_uv_part', 'task359_casino_classification_negotiation_vouch_fair', 'task363_sst2_polarity_classification', 'task364_regard_social_impact_classification', 'task365_synthetic_remove_vowels', 'task366_synthetic_return_primes', 'task367_synthetic_remove_floats', 'task368_synthetic_even_or_odd_calculation', 'task369_synthetic_remove_odds', 'task370_synthetic_remove_divisible_by_3', 'task371_synthetic_product_of_list', 'task372_synthetic_palindrome_numbers', 'task373_synthetic_round_tens_place', 'task374_synthetic_pos_or_neg_calculation', 'task375_classify_type_of_sentence_in_debate', 'task376_reverse_order_of_words', 'task377_remove_words_of_given_length', 'task378_reverse_words_of_given_length', 'task379_agnews_topic_classification', 'task380_boolq_yes_no_question', 'task381_boolq_question_generation', 'task382_hybridqa_answer_generation', 'task383_matres_classification', 'task384_socialiqa_question_classification', 'task385_socialiqa_incorrect_answer_generation', 'task386_semeval_2018_task3_irony_detection', 'task387_semeval_2018_task3_irony_classification', 'task388_torque_token_classification', 'task389_torque_generate_temporal_question', 'task390_torque_text_span_selection', 'task397_semeval_2018_task1_tweet_anger_detection', 'task398_semeval_2018_task1_tweet_joy_detection', 'task399_semeval_2018_task1_tweet_sadness_detection', 'task400_paws_paraphrase_classification', 'task403_creak_commonsense_inference', 'task405_narrativeqa_question_generation', 'task413_mickey_en_sentence_perturbation_generation', 'task428_senteval_inversion', 'task429_senteval_tense', 'task430_senteval_subject_count', 'task431_senteval_object_count', 'task453_swag_answer_generation', 'task454_swag_incorrect_answer_generation', 'task455_swag_context_generation', 'task456_matres_intention_classification', 'task457_matres_conditional_classification', 'task458_matres_negation_classification', 'task459_matres_static_classification', 'task460_qasper_answer_generation', 'task461_qasper_question_generation', 'task462_qasper_classification', 'task469_mrqa_answer_generation', 'task470_mrqa_question_generation', 'task471_haspart_answer_generation', 'task472_haspart_classification', 'task475_yelp_polarity_classification', 'task476_cls_english_books_classification', 'task477_cls_english_dvd_classification', 
'task478_cls_english_music_classification', 'task488_extract_all_alphabetical_elements_from_list_in_order', 'task489_mwsc_question_generation', 'task490_mwsc_options_generation', 'task491_mwsc_answer_generation', 'task492_mwsc_incorrect_answer_generation', 'task493_review_polarity_classification', 'task494_review_polarity_answer_generation', 'task495_semeval_headline_classification', 'task496_semeval_answer_generation', 'task497_extract_all_numbers_from_list_in_order', 'task499_extract_and_add_all_numbers_from_list', 'task504_count_all_alphabetical_elements_in_list', 'task505_count_all_numerical_elements_in_list', 'task506_position_of_all_alphabetical_elements_in_list', 'task507_position_of_all_numerical_elements_in_list', 'task509_collate_of_all_alphabetical_and_numerical_elements_in_list_separately', 'task512_twitter_emotion_classification', 'task513_argument_stance_classification', 'task514_argument_consequence_classification', 'task515_senteval_odd_word_out', 'task516_senteval_conjoints_inversion', 'task517_emo_classify_emotion_of_dialogue', 'task518_emo_different_dialogue_emotions', 'task521_trivia_question_classification', 'task522_news_editorial_summary', 'task523_find_if_numbers_or_alphabets_are_more_in_list', 'task547_alt_translation_entk_en', 'task550_discofuse_sentence_generation', 'task560_alt_translation_en_entk', 'task563_discofuse_answer_generation', 'task564_discofuse_classification', 'task565_circa_answer_generation', 'task566_circa_classification', 'task567_circa_text_generation', 'task568_circa_question_generation', 'task573_air_dialogue_classification', 'task574_air_dialogue_sentence_generation', 'task575_air_dialogue_classification', 'task576_curiosity_dialogs_answer_generation', 'task577_curiosity_dialogs_classification', 'task578_curiosity_dialogs_answer_generation', 'task579_socialiqa_classification', 'task580_socialiqa_answer_generation', 'task581_socialiqa_question_generation', 'task582_naturalquestion_answer_generation', 'task583_udeps_eng_coarse_pos_tagging', 'task584_udeps_eng_fine_pos_tagging', 'task585_preposition_classification', 'task586_amazonfood_polarity_classification', 'task587_amazonfood_polarity_correction_classification', 'task588_amazonfood_rating_classification', 'task589_amazonfood_summary_text_generation', 'task590_amazonfood_summary_correction_classification', 'task591_sciq_answer_generation', 'task592_sciq_incorrect_answer_generation', 'task593_sciq_explanation_generation', 'task594_sciq_question_generation', 'task595_mocha_answer_generation', 'task596_mocha_question_generation', 'task597_cuad_answer_generation', 'task598_cuad_answer_generation', 'task599_cuad_question_generation', 'task600_find_the_longest_common_substring_in_two_strings', 'task605_find_the_longest_common_subsequence_in_two_lists', 'task606_sum_of_all_numbers_in_list_between_positions_i_and_j', 'task607_sbic_intentional_offense_binary_classification', 'task608_sbic_sexual_offense_binary_classification', 'task609_sbic_potentially_offense_binary_classification', 'task610_conllpp_ner', 'task611_mutual_multi_turn_dialogue', 'task615_moviesqa_answer_generation', 'task616_cola_classification', 'task617_amazonreview_category_text_generation', 'task618_amazonreview_summary_text_generation', 'task622_replace_alphabets_in_a_list_by_their_position_in_english_alphabet', 'task625_xlwic_true_or_false_answer_generation', 'task626_xlwic_sentence_based_on_given_word_sentence_generation', 'task627_xlwic_word_with_same_meaning_sentence_generation', 
'task628_xlwic_word_with_different_meaning_sentence_generation', 'task629_dbpedia_14_classification', 'task630_dbpedia_14_classification', 'task631_dbpedia_14_incorrect_answer_generation', 'task632_dbpedia_14_classification', 'task633_dbpedia_14_answer_generation', 'task636_extract_and_sort_unique_alphabets_in_a_list', 'task637_extract_and_sort_unique_digits_in_a_list', 'task638_multi_woz_classification', 'task639_multi_woz_user_utterance_generation', 'task649_race_blank_question_generation', 'task664_mmmlu_answer_generation_abstract_algebra', 'task665_mmmlu_answer_generation_anatomy', 'task666_mmmlu_answer_generation_astronomy', 'task667_mmmlu_answer_generation_business_ethics', 'task668_extreme_abstract_summarization', 'task672_amazon_and_yelp_summarization_dataset_summarization', 'task672_nummersense', 'task673_google_wellformed_query_classification', 'task674_google_wellformed_query_sentence_generation', 'task675_google_wellformed_query_sentence_generation', 'task679_hope_edi_english_text_classification', 'task681_hope_edi_malayalam_text_classification', 'task682_online_privacy_policy_text_classification', 'task683_online_privacy_policy_text_purpose_answer_generation', 'task684_online_privacy_policy_text_information_type_generation', 'task685_mmmlu_answer_generation_clinical_knowledge', 'task686_mmmlu_answer_generation_college_biology', 'task687_mmmlu_answer_generation_college_chemistry', 'task688_mmmlu_answer_generation_college_computer_science', 'task689_mmmlu_answer_generation_college_mathematics', 'task690_mmmlu_answer_generation_college_medicine', 'task691_mmmlu_answer_generation_college_physics', 'task692_mmmlu_answer_generation_computer_security', 'task693_mmmlu_answer_generation_conceptual_physics', 'task694_mmmlu_answer_generation_econometrics', 'task695_mmmlu_answer_generation_electrical_engineering', 'task696_mmmlu_answer_generation_elementary_mathematics', 'task697_mmmlu_answer_generation_formal_logic', 'task698_mmmlu_answer_generation_global_facts', 'task699_mmmlu_answer_generation_high_school_biology', 'task700_mmmlu_answer_generation_high_school_chemistry', 'task701_mmmlu_answer_generation_high_school_computer_science', 'task702_mmmlu_answer_generation_high_school_european_history', 'task703_mmmlu_answer_generation_high_school_geography', 'task704_mmmlu_answer_generation_high_school_government_and_politics', 'task705_mmmlu_answer_generation_high_school_macroeconomics', 'task706_mmmlu_answer_generation_high_school_mathematics', 'task707_mmmlu_answer_generation_high_school_microeconomics', 'task708_mmmlu_answer_generation_high_school_physics', 'task709_mmmlu_answer_generation_high_school_psychology', 'task710_mmmlu_answer_generation_high_school_statistics', 'task711_mmmlu_answer_generation_high_school_us_history', 'task712_mmmlu_answer_generation_high_school_world_history', 'task713_mmmlu_answer_generation_human_aging', 'task714_mmmlu_answer_generation_human_sexuality', 'task715_mmmlu_answer_generation_international_law', 'task716_mmmlu_answer_generation_jurisprudence', 'task717_mmmlu_answer_generation_logical_fallacies', 'task718_mmmlu_answer_generation_machine_learning', 'task719_mmmlu_answer_generation_management', 'task720_mmmlu_answer_generation_marketing', 'task721_mmmlu_answer_generation_medical_genetics', 'task722_mmmlu_answer_generation_random_topic', 'task723_mmmlu_answer_generation_moral_disputes', 'task724_mmmlu_answer_generation_moral_scenarios', 'task725_mmmlu_answer_generation_nutrition', 'task726_mmmlu_answer_generation_philosophy', 
'task727_mmmlu_answer_generation_prehistory', 'task728_mmmlu_answer_generation_professional_accounting', 'task729_mmmlu_answer_generation_professional_law', 'task730_mmmlu_answer_generation_professional_medicine', 'task731_mmmlu_answer_generation_professional_psychology', 'task732_mmmlu_answer_generation_public_relations', 'task733_mmmlu_answer_generation_security_studies', 'task734_mmmlu_answer_generation_sociology', 'task735_mmmlu_answer_generation_us_foreign_policy', 'task736_mmmlu_answer_generation_virology', 'task737_mmmlu_answer_generation_world_religions', 'task739_lhoestq_question_generation', 'task740_lhoestq_answer_generation_quantity', 'task741_lhoestq_answer_generation_place', 'task742_lhoestq_answer_generation_frequency', 'task745_ai2_arithmetic_questions_arithmetic', 'task746_yelp_restaurant_review_classification', 'task750_aqua_multiple_choice_answering', 'task751_svamp_subtraction_question_answering', 'task752_svamp_multiplication_question_answering', 'task753_svamp_addition_question_answering', 'task754_svamp_common-division_question_answering', 'task755_find_longest_substring_and_replace_its_sorted_lowercase_version_in_both_lists', 'task756_find_longert_substring_and_return_all_unique_alphabets_in_it', 'task761_app_review_classification', 'task766_craigslist_bargains_classification', 'task767_craigslist_bargains_classification', 'task770_pawsx_english_text_modification', 'task819_pec_sentiment_classification', 'task820_protoqa_answer_generation', 'task821_protoqa_question_generation', 'task823_peixian-rtgender_sentiment_analysis', 'task833_poem_sentiment_classification', 'task834_mathdataset_classification', 'task835_mathdataset_answer_generation', 'task843_financial_phrasebank_classification', 'task844_financial_phrasebank_classification', 'task845_pubmedqa_question_generation', 'task846_pubmedqa_classification', 'task847_pubmedqa_question_generation', 'task848_pubmedqa_classification', 'task849_pubmedqa_answer_generation', 'task850_synthetic_longest_palindrome', 'task851_synthetic_multiply_evens', 'task852_synthetic_multiply_odds', 'task853_hippocorpus_long_text_generation', 'task854_hippocorpus_classification', 'task855_conv_ai_2_classification', 'task856_conv_ai_2_classification', 'task857_inquisitive_question_generation', 'task858_inquisitive_span_detection', 'task859_prost_question_generation', 'task860_prost_mcq_generation', 'task861_asdiv_addsub_question_answering', 'task861_prost_mcq_answers_generation', 'task862_asdiv_multidiv_question_answering', 'task863_asdiv_multiop_question_answering', 'task864_asdiv_singleop_question_answering', 'task865_mawps_addsub_question_answering', 'task866_mawps_multidiv_question_answering', 'task867_mawps_multiop_question_answering', 'task868_cfq_mcd1_explanation_to_sql', 'task868_mawps_singleop_question_answering', 'task869_cfq_mcd1_sql_to_explanation', 'task870_msmarco_answer_generation', 'task871_msmarco_question_generation', 'task874_opus_xhosanavy_sr', 'task875_emotion_classification', 'task886_quail_question_generation', 'task887_quail_answer_generation', 'task888_reviews_classification', 'task889_goemotions_classification', 'task897_freebase_qa_topic_question_generation', 'task898_freebase_qa_answer_generation', 'task899_freebase_qa_topic_generation', 'task900_freebase_qa_category_classification', 'task901_freebase_qa_category_question_generation', 'task902_deceptive_opinion_spam_classification', 'task903_deceptive_opinion_spam_classification', 'task904_hate_speech_offensive_classification', 
'task905_hate_speech_offensive_classification', 'task906_dialogre_identify_names', 'task907_dialogre_identify_relationships', 'task908_dialogre_identify_familial_relationships', 'task909_dialogre_prevalent_speakers', 'task917_coqa_question_generation', 'task918_coqa_answer_generation', 'task919_coqa_incorrect_answer_generation', 'task921_code_x_glue_information_retreival', 'task922_event2mind_word_generation', 'task923_event2mind_classifier', 'task924_event2mind_word_generation', 'task925_coached_conv_pref_classifier', 'task926_coached_conv_pref_word_generation', 'task927_yelp_negative_to_positive_style_transfer', 'task928_yelp_positive_to_negative_style_transfer', 'task929_products_reviews_classification', 'task933_wiki_auto_style_transfer', 'task934_turk_simplification', 'task955_wiki_auto_style_transfer', 'task956_leetcode_420_strong_password_check', 'task963_librispeech_asr_next_word_prediction', 'task964_librispeech_asr_text_auto_completion', 'task965_librispeech_asr_missing_word_prediction', 'task966_ruletaker_fact_checking_based_on_given_context', 'task967_ruletaker_incorrect_fact_generation_based_on_given_paragraph']
```
Validation Tasks:
```
['task1333_check_validity_date_ddmmyyyy', 'task1403_check_validity_date_mmddyyyy', 'task291_semeval_2020_task4_commonsense_validation']
```
Test Tasks:
```
['task020_mctaco_span_based_question', 'task033_winogrande_answer_generation', 'task034_winogrande_question_modification_object', 'task035_winogrande_question_modification_person', 'task036_qasc_topic_word_to_generate_related_fact', 'task039_qasc_find_overlapping_words', 'task050_multirc_answerability', 'task102_commongen_sentence_generation', 'task104_semeval_2019_task10_closed_vocabulary_mathematical_answer_generation', 'task1152_bard_analogical_reasoning_causation', 'task1153_bard_analogical_reasoning_affordance', 'task1154_bard_analogical_reasoning_travel', 'task1155_bard_analogical_reasoning_trash_or_treasure', 'task1156_bard_analogical_reasoning_tools', 'task1157_bard_analogical_reasoning_rooms_for_containers', 'task1158_bard_analogical_reasoning_manipulating_items', 'task1159_bard_analogical_reasoning_containers', 'task1161_coda19_title_generation', 'task118_semeval_2019_task10_open_vocabulary_mathematical_answer_generation', 'task1195_disflqa_disfluent_to_fluent_conversion', 'task119_semeval_2019_task10_geometric_mathematical_answer_generation', 'task121_zest_text_modification', 'task1336_peixian_equity_evaluation_corpus_gender_classifier', 'task1338_peixian_equity_evaluation_corpus_sentiment_classifier', 'task1339_peixian_equity_evaluation_corpus_text_completion', 'task133_winowhy_reason_plausibility_detection', 'task1342_amazon_us_reviews_title', 'task1344_glue_entailment_classification', 'task1345_glue_qqp_question_paraprashing', 'task1356_xlsum_title_generation', 'task1358_xlsum_title_generation', 'task1385_anli_r1_entailment', 'task1386_anli_r2_entailment', 'task1387_anli_r3_entailment', 'task1388_cb_entailment', 'task1390_wscfixed_coreference', 'task1391_winogrande_easy_answer_generation', 'task1393_superglue_copa_text_completion', 'task1394_meta_woz_task_classification', 'task1407_dart_question_generation', 'task1409_dart_text_generation', 'task1429_evalution_semantic_relation_classification', 'task1439_doqa_cooking_isanswerable', 'task1442_doqa_movies_isanswerable', 'task1509_evalution_antonyms', 'task1510_evalution_relation_extraction', 'task1516_imppres_naturallanguageinference', 'task1529_scitail1.1_classification', 'task1531_daily_dialog_type_classification', 'task1533_daily_dialog_formal_classification', 'task1534_daily_dialog_question_classification', 'task1540_parsed_pdfs_summarization', 'task1554_scitail_classification', 'task1557_jfleg_answer_generation', 'task1562_zest_text_modification', 'task1584_evalution_meronym_classification', 'task1586_scifact_title_generation', 'task1598_nyc_long_text_generation', 'task1612_sick_label_classification', 'task1615_sick_tclassify_b_relation_a', 'task1622_disfl_qa_text_modication', 'task1624_disfl_qa_question_yesno_classification', 'task1631_openpi_answer_generation', 'task1640_aqa1.0_answerable_unanswerable_question_classification', 'task1659_title_generation', 'task1664_winobias_text_generation', 'task1728_web_nlg_data_to_text', 'task190_snli_classification', 'task199_mnli_classification', 'task200_mnli_entailment_classification', 'task201_mnli_neutral_classification', 'task202_mnli_contradiction_classification', 'task219_rocstories_title_answer_generation', 'task220_rocstories_title_classification', 'task226_english_language_answer_relevance_classification', 'task232_iirc_link_number_classification', 'task233_iirc_link_exists_classification', 'task242_tweetqa_classification', 'task249_enhanced_wsc_pronoun_disambiguation', 'task281_points_of_correspondence', 'task288_gigaword_summarization', 
'task290_tellmewhy_question_answerability', 'task291_semeval_2020_task4_commonsense_validation', 'task295_semeval_2020_task4_commonsense_reasoning', 'task304_numeric_fused_head_resolution', 'task329_gap_classification', 'task330_gap_answer_generation', 'task333_hateeval_classification_hate_en', 'task335_hateeval_classification_aggresive_en', 'task337_hateeval_classification_individual_en', 'task349_squad2.0_answerable_unanswerable_question_classification', 'task362_spolin_yesand_prompt_response_sub_classification', 'task386_semeval_2018_task3_irony_detection', 'task387_semeval_2018_task3_irony_classification', 'task391_causal_relationship', 'task392_inverse_causal_relationship', 'task393_plausible_result_generation', 'task397_semeval_2018_task1_tweet_anger_detection', 'task398_semeval_2018_task1_tweet_joy_detection', 'task399_semeval_2018_task1_tweet_sadness_detection', 'task401_numeric_fused_head_reference', 'task402_grailqa_paraphrase_generation', 'task418_persent_title_generation', 'task428_senteval_inversion', 'task429_senteval_tense', 'task430_senteval_subject_count', 'task431_senteval_object_count', 'task442_com_qa_paraphrase_question_generation', 'task495_semeval_headline_classification', 'task496_semeval_answer_generation', 'task500_scruples_anecdotes_title_generation', 'task510_reddit_tifu_title_summarization', 'task515_senteval_odd_word_out', 'task516_senteval_conjoints_inversion', 'task520_aquamuse_answer_given_in_passage', 'task569_recipe_nlg_text_generation', 'task602_wikitext-103_answer_generation', 'task613_politifact_text_generation', 'task614_glucose_cause_event_detection', 'task619_ohsumed_abstract_title_generation', 'task620_ohsumed_medical_subject_headings_answer_generation', 'task623_ohsumed_yes_no_answer_generation', 'task640_esnli_classification', 'task641_esnli_classification', 'task642_esnli_classification', 'task645_summarization', 'task648_answer_generation', 'task670_ambigqa_question_generation', 'task671_ambigqa_text_generation', 'task677_ollie_sentence_answer_generation', 'task738_perspectrum_classification', 'task743_eurlex_summarization', 'task760_msr_sqa_long_text_generation', 'task769_qed_summarization', 'task827_copa_commonsense_reasoning', 'task828_copa_commonsense_cause_effect', 'task879_schema_guided_dstc8_classification', 'task880_schema_guided_dstc8_classification', 'task890_gcwd_classification', 'task891_gap_coreference_resolution', 'task892_gap_reverse_coreference_resolution', 'task893_gap_fill_the_blank_coreference_resolution', 'task909_dialogre_prevalent_speakers', 'task935_defeasible_nli_atomic_classification', 'task936_defeasible_nli_snli_classification', 'task937_defeasible_nli_social_classification', 'task957_e2e_nlg_text_generation_generate', 'task970_sherliic_causal_relationship']
```
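
These split lists can be used directly to carve the data into train / validation / test subsets by task. The snippet below is a minimal sketch of one way to do that with the `datasets` library; the repository id, the `split` argument, and the `task_name` column name are illustrative assumptions, so adjust them to match how you actually load this dataset.

```
# Minimal sketch (illustrative, not part of the original splits file):
# keep only examples belonging to the validation tasks listed above.
# Assumptions: the data is hosted at "Muennighoff/natural-instructions",
# loads with a "train" split, and stores the task name in a "task_name" column.
from datasets import load_dataset

validation_tasks = {
    "task1333_check_validity_date_ddmmyyyy",
    "task1403_check_validity_date_mmddyyyy",
    "task291_semeval_2020_task4_commonsense_validation",
}

ds = load_dataset("Muennighoff/natural-instructions", split="train")
validation_subset = ds.filter(lambda example: example["task_name"] in validation_tasks)
print(f"{len(validation_subset)} examples across {len(validation_tasks)} validation tasks")
```

The same pattern works for the train and test lists; for the larger splits, passing `num_proc` to `filter` speeds things up.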