Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown below.
The dataset generation failed because of a cast error
Error code:   DatasetGenerationCastError
Exception:    DatasetGenerationCastError
Message:      An error occurred while generating the dataset

All the data files must have the same columns, but at some point there are 1 new columns ({'reverse_qa'})

This happened while the json dataset builder was generating data using

hf://datasets/jym7/BAKE/BAKE_qa.json (at revision 3e4d3545bf46daf0789df8abf4fc0ee81ba11e80)

Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://huggingface.co/docs/hub/datasets-manual-configuration#multiple-configurations)
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1871, in _prepare_split_single
                  writer.write_table(table)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 643, in write_table
                  pa_table = table_cast(pa_table, self._schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2293, in table_cast
                  return cast_table_to_schema(table, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2241, in cast_table_to_schema
                  raise CastError(
              datasets.table.CastError: Couldn't cast
              case_id: int64
              requested_rewrite: struct<prompt: string, relation_id: string, subject: string, target_new: struct<str: string>, target_true: struct<str: string>>
                child 0, prompt: string
                child 1, relation_id: string
                child 2, subject: string
                child 3, target_new: struct<str: string>
                    child 0, str: string
                child 4, target_true: struct<str: string>
                    child 0, str: string
              reverse_qa: struct<prompt: string, target_new: struct<str: string>, target_true: struct<str: string>>
                child 0, prompt: string
                child 1, target_new: struct<str: string>
                    child 0, str: string
                child 2, target_true: struct<str: string>
                    child 0, str: string
              reverse_judge: struct<prompt: string, target_new: struct<str: string>, target_true: struct<str: string>>
                child 0, prompt: string
                child 1, target_new: struct<str: string>
                    child 0, str: string
                child 2, target_true: struct<str: string>
                    child 0, str: string
              paraphrase_prompts: list<item: string>
                child 0, item: string
              neighborhood_prompts: list<item: string>
                child 0, item: string
              -- schema metadata --
              pandas: '{"index_columns": [], "column_indexes": [], "columns": [{"name":' + 860
              to
              {'case_id': Value(dtype='int64', id=None), 'requested_rewrite': {'prompt': Value(dtype='string', id=None), 'relation_id': Value(dtype='string', id=None), 'subject': Value(dtype='string', id=None), 'target_new': {'str': Value(dtype='string', id=None)}, 'target_true': {'str': Value(dtype='string', id=None)}}, 'reverse_judge': {'prompt': Value(dtype='string', id=None), 'target_new': {'str': Value(dtype='string', id=None)}, 'target_true': {'str': Value(dtype='string', id=None)}}, 'paraphrase_prompts': Sequence(feature=Value(dtype='string', id=None), length=-1, id=None), 'neighborhood_prompts': Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)}
              because column names don't match
              
              During handling of the above exception, another exception occurred:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1433, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1050, in convert_to_parquet
                  builder.download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 925, in download_and_prepare
                  self._download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1001, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1742, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1873, in _prepare_split_single
                  raise DatasetGenerationCastError.from_cast_error(
              datasets.exceptions.DatasetGenerationCastError: An error occurred while generating the dataset
              
              All the data files must have the same columns, but at some point there are 1 new columns ({'reverse_qa'})
              
              This happened while the json dataset builder was generating data using
              
              hf://datasets/jym7/BAKE/BAKE_qa.json (at revision 3e4d3545bf46daf0789df8abf4fc0ee81ba11e80)
              
              Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://huggingface.co/docs/hub/datasets-manual-configuration#multiple-configurations)
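The cast error above means at least one record in `BAKE_qa.json` carries a top-level key (`reverse_qa`) that the schema inferred from the first batch lacked. A minimal stdlib sketch for locating such records (the `group_by_keys` helper is illustrative, not part of the `datasets` library; the two inline records stand in for the real file):

```python
import json
from collections import defaultdict

def group_by_keys(records):
    """Map each distinct top-level key set to the indices of records using it."""
    groups = defaultdict(list)
    for i, rec in enumerate(records):
        groups[frozenset(rec)].append(i)
    return groups

# Inline stand-ins for two records of the real file; the actual file would be
# loaded with: records = json.load(open("BAKE_qa.json"))
records = [
    {"case_id": 0, "requested_rewrite": {}, "reverse_judge": {},
     "paraphrase_prompts": [], "neighborhood_prompts": []},
    {"case_id": 1, "requested_rewrite": {}, "reverse_judge": {},
     "reverse_qa": {},  # the extra column the cast error complains about
     "paraphrase_prompts": [], "neighborhood_prompts": []},
]

for keys, idxs in group_by_keys(records).items():
    print(sorted(keys), "->", len(idxs), "records, e.g. index", idxs[0])
```

Records sharing a key set are grouped together, so a single extra key shows up as a second, smaller group whose example index points at an offending record.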

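The error's second suggested fix is to split the mismatched files into separate configurations via the `configs` field in the repository's README YAML front matter (per the linked Hub docs). A sketch, assuming the repo keeps the QA file alongside a main file; the config names and the `BAKE.json` filename are hypothetical:

```yaml
configs:
- config_name: default
  data_files: "BAKE.json"   # hypothetical main file
- config_name: qa
  data_files: "BAKE_qa.json"
```

With this in place, each configuration is built with its own schema, and the QA split loads as `load_dataset("jym7/BAKE", "qa")`.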

Preview columns (each row below lists these five fields in order):

case_id — int64
requested_rewrite — dict
reverse_judge — dict
paraphrase_prompts — sequence
neighborhood_prompts — sequence
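The dict columns are nested Arrow structs (e.g. `requested_rewrite.target_new.str` in the schema above). To inspect one preview row flat, a small stdlib sketch (the `flatten` helper is mine, not a `datasets` API):

```python
def flatten(d, prefix=""):
    """Recursively flatten nested dicts into dotted keys, mirroring the Arrow struct fields."""
    out = {}
    for k, v in d.items():
        key = f"{prefix}{k}"
        if isinstance(v, dict):
            out.update(flatten(v, key + "."))
        else:
            out[key] = v
    return out

# First preview row, trimmed to the nested column of interest.
row = {
    "case_id": 0,
    "requested_rewrite": {
        "prompt": "{} buried in",
        "relation_id": "P119",
        "subject": "Hubert de Burgh-Canning, 2nd Marquess of Clanricarde",
        "target_new": {"str": "Homewood Cemetery"},
        "target_true": {"str": "Highgate Cemetery"},
    },
}
print(flatten(row)["requested_rewrite.target_new.str"])  # → Homewood Cemetery
```

The dotted keys produced here match the `child` field paths printed in the cast-error schema, which makes it easy to compare a raw JSON record against the expected Arrow layout.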
0
{ "prompt": "{} buried in", "relation_id": "P119", "subject": "Hubert de Burgh-Canning, 2nd Marquess of Clanricarde", "target_new": { "str": "Homewood Cemetery" }, "target_true": { "str": "Highgate Cemetery" } }
{ "prompt": "Whether the place Homewood Cemetery has buried Hubert de Burgh-Canning, 2nd Marquess of Clanricarde?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The burial place of Hubert de Burgh-Canning, 2nd Marquess of Clanricarde is" ]
[ "Charles Lucy buried in", "John William Salter buried in", "Harrison Hayter buried in", "Karl Marx buried in", "The burial place of Dachine Rainer is", "James Francis Helvetius Hobler buried in", "Frances Polidori buried in", "The burial place of Gilbert Abbott à Beckett is", "The burial place of John Pitt Kennedy is", "The burial place of Joseph Edwards (sculptor) is" ]
1
{ "prompt": "{} holds the position", "relation_id": "P39", "subject": "Bellarmine", "target_new": { "str": "Archbishop" }, "target_true": { "str": "Archbishop" } }
{ "prompt": "Whether the position Archbishop is held by Bellarmine?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The position held by Bellarmine is", "Bellarmine, who holds the position" ]
[ "James of Viterbo holds the position", "The position held by Patrick Clune is", "Pope Emeritus Benedict XVI, who holds the position", "José Salazar López holds the position", "Louis III, Cardinal of Guise holds the position", "Antoine Blanc, who holds the position", "The position held by Saint Norbert is", "Angelo Comastri, who holds the position", "The position held by Pope Benedict XV is", "The position held by Agustín Roberto Radrizzani is" ]
2
{ "prompt": "The author of {} is", "relation_id": "P50", "subject": "Lincoln Island", "target_new": { "str": "Angus Maddison" }, "target_true": { "str": "Jules Verne" } }
{ "prompt": "Whether Angus Maddison is the author of Lincoln Island?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The writer of Lincoln Island is", "Lincoln Island, which is written by" ]
[ "The writer of Une fantaisie du docteur Ox is", "Invasion of the Sea, which is written by", "Facing the Flag, which is written by", "The Mysterious Island, which is written by", "The writer of César Cascabel is", "A Drama in Mexico, which is written by", "The writer of The Blockade Runners is", "The author of The Village in the Treetops is", "Master of the World (novel), which is written by", "Two Years' Vacation, which is written by" ]
3
{ "prompt": "The designer of {} is", "relation_id": "P530", "subject": "PT", "target_new": { "str": "Andorra" }, "target_true": { "str": "Brazil" } }
{ "prompt": "Whether Andorra is the designer of PT?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "PT is designed by", "PT, which has designer" ]
[ "Perú is adjacent to", "Armenia is designed by", "The designer of Angola is", "The designer of Equatorial Guinea is", "US, which has designer", "The designer of Ukraine is", "Venezuela, which has designer", "Turkey is designed by", "EU is adjacent to", "The designer of United States is" ]
4
{ "prompt": "{}, which is subsidiary of", "relation_id": "P749", "subject": "CNBC", "target_new": { "str": "Bell Aliant" }, "target_true": { "str": "NBCUniversal" } }
{ "prompt": "Whether Bell Aliant, which has subsidiary CNBC?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "CNBC has parent organization", "CNBC is owned by" ]
[ "Universal Studios Theme Parks is owned by", "Universal Theme Parks is owned by", "Telemundo, which is subsidiary of", "Movies 24, which is subsidiary of", "National Broadcasting Company (NBC) has parent organization", "NBC is owned by", "Universal Channel is owned by", "Universal Parks & Resorts, which is subsidiary of", "Universal Studios, which is subsidiary of", "Universal Pictures, which is subsidiary of" ]
5
{ "prompt": "{} is adjacent to", "relation_id": "P47", "subject": "Friuli-Venezia-Giulia", "target_new": { "str": "Beijing" }, "target_true": { "str": "Carinthia" } }
{ "prompt": "Whether Beijing is next to Friuli-Venezia-Giulia?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Friuli-Venezia-Giulia shares border with" ]
[ "Udine shares border with", "Styria is adjacent to", "Friuli-Venezia Giulia shares border with", "Friuli is adjacent to", "Veneto is adjacent to" ]
6
{ "prompt": "{} is made by", "relation_id": "P176", "subject": "MacBook Air", "target_new": { "str": "Isuzu" }, "target_true": { "str": "Apple" } }
{ "prompt": "Whether Isuzu has made the MacBook Air?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The manufacturer of MacBook Air is" ]
[ "Steve Wozniak is working for", "Apple Computer has made the", "The manufacturer of iPad Mini is", "Apple Monitor II is made by", "The employer of Jeff Robbin is", "Steve Jobs is working for", "MessagePad is made by", "The manufacturer of original Macintosh is", "iPhone 5 is made by", "The employer of Angela Ahrendts is" ]
7
{ "prompt": "{}, which is written by", "relation_id": "P50", "subject": "The Adventure of the Dancing Men", "target_new": { "str": "Janet Evanovich" }, "target_true": { "str": "Sir Arthur Conan Doyle" } }
{ "prompt": "Whether Janet Evanovich, who has written the The Adventure of the Dancing Men?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The author of The Adventure of the Dancing Men is", "The writer of The Adventure of the Dancing Men is" ]
[ "The Parasite, which is written by", "The writer of The Adventure of the Norwood Builder is", "The author of The Boscombe Valley Mystery is", "The writer of The Adventure of the Noble Bachelor is", "The author of The Adventure of the Missing Three-Quarter is", "The Adventure of the Gloria Scott, which is written by", "The writer of The Adventure of the Dying Detective is", "A Study in Scarlet, which is written by", "The writer of The Adventure of the Empty House is", "The writer of The Dancing Men is" ]
8
{ "prompt": "The mother tongue in {} is", "relation_id": "P103", "subject": "Georges Sorel", "target_new": { "str": "French" }, "target_true": { "str": "French" } }
{ "prompt": "Whether French is the mother tongue in Georges Sorel?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The native language of Georges Sorel is", "Georges Sorel, where the native language is" ]
[ "The native language of Guillaume-Hyacinthe Bougeant is", "Pierre Lagaillarde, where the native language is", "The native language of Dominique Laffin is", "The official language of Walloon south is", "The official language of French-Armenian is", "Gabriel Jabbour, where the native language is", "The native language of Charles de Rochefort is", "The native language of Serge Brussolo is", "The mother tongue in Guillaume Thomas François Raynal is", "Jean-Marie Colombani, where the native language is" ]
9
{ "prompt": "{} has lowest point", "relation_id": "P1589", "subject": "Cape Verde", "target_new": { "str": "Mekong" }, "target_true": { "str": "Atlantic" } }
{ "prompt": "Whether Mekong is the lowest point of Cape Verde?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The lowest point of Cape Verde is" ]
[ "The lowest point of Gabon is", "The lowest point of Suriname is", "WMG, which has subsidiary", "Sierra Leone has lowest point", "Guinea has lowest point", "WEA, which has subsidiary", "The lowest point of Congo is", "The lowest point of Angola is", "The lowest point of Portugal is", "The lowest point of Surinam is" ]
10
{ "prompt": "The writer of {} is", "relation_id": "P50", "subject": "Eragon", "target_new": { "str": "Douglas Adams" }, "target_true": { "str": "Christopher Paolini" } }
{ "prompt": "Whether Douglas Adams is the writer of Eragon?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The author of Eragon is", "Eragon, which is written by" ]
[ "The writer of Inheritance (Paolini novel) is", "The writer of Brisingr is", "The author of Eldest is", "The writer of Inheritance Cycle is", "Urgals, which is written by" ]
11
{ "prompt": "The mother tongue in {} is", "relation_id": "P103", "subject": "Tiruvalluvar", "target_new": { "str": "Russian" }, "target_true": { "str": "Tamil language" } }
{ "prompt": "Whether Russian is the mother tongue in Tiruvalluvar?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The native language of Tiruvalluvar is", "Tiruvalluvar, where the native language is" ]
[ "The language spoken in Madras city is", "The language spoken in Tamil Nadu is", "The native language of Sujatha is", "The official language of Singapore is", "The language spoken in Coimbatore is", "Sridevi, where the native language is", "The language spoken in Chennai is", "The language spoken in Sri Lanka is", "Madras, where the official language is", "The mother tongue in C. N. Annadurai is" ]
12
{ "prompt": "{} is adjacent to", "relation_id": "P47", "subject": "Hidalgo (state)", "target_new": { "str": "Habersham County" }, "target_true": { "str": "Mexico" } }
{ "prompt": "Whether Habersham County is next to Hidalgo (state)?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Hidalgo (state) shares border with" ]
[ "The designer of Italy is", "Italian is designed by", "Mexico City shares border with", "The designer of Guatemala is", "Tlaxcala is adjacent to", "New Spain, which has the capital", "Brazil is designed by", "US shares border with", "Saudi Arabia, which has designer", "The designer of Brasil is" ]
13
{ "prompt": "{} is made by", "relation_id": "P176", "subject": "GP7", "target_new": { "str": "Ideal Toy Company" }, "target_true": { "str": "EMD" } }
{ "prompt": "Whether Ideal Toy Company has made the GP7?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The manufacturer of GP7 is" ]
[ "British Rail Class 67 is designed by", "The manufacturer of EMD F9 is", "E7A is made by", "The manufacturer of E8 is", "The manufacturer of EMD 567 is", "Class 59 is made by", "EMD SD45 is made by", "EMD F-unit is made by", "The manufacturer of EMD GP15AC is", "EMD GP50 is made by" ]
14
{ "prompt": "{} is adjacent to", "relation_id": "P47", "subject": "King County, Washington", "target_new": { "str": "Breisgau-Hochschwarzwald" }, "target_true": { "str": "Pierce County, Washington" } }
{ "prompt": "Whether Breisgau-Hochschwarzwald is next to King County, Washington?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "King County, Washington shares border with" ]
[ "King County shares border with" ]
15
{ "prompt": "The native language of {} is", "relation_id": "P103", "subject": "Origa", "target_new": { "str": "French" }, "target_true": { "str": "Russian" } }
{ "prompt": "Whether French is the native language of Origa?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The mother tongue in Origa is", "Origa, where the native language is" ]
[ "The native language of Ognjeslav Kostović Stepanović is", "The native language of Krylov is", "The language spoken in Kyrgyzstan is", "The mother tongue in Gorky is", "The mother tongue in Alexei Shulgin is", "The language spoken in Perm Krai is", "Ingushetia, where the official language is", "The native language of Dmitry Kholodov is", "The mother tongue in Igor Bunich is", "Vladilen Mashkovtsev, where the native language is" ]
16
{ "prompt": "{} is employed by", "relation_id": "P108", "subject": "Steven Muller", "target_new": { "str": "Princeton University" }, "target_true": { "str": "Cornell University" } }
{ "prompt": "Whether Princeton University has employed Steven Muller?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The employer of Steven Muller is", "Steven Muller, whose employer is", "Steven Muller is working for" ]
[ "Lawrence E. Blume, whose employer is", "James Ewing (pathologist), whose employer is", "The employer of Robert Sproull is", "Walter LaFeber is employed by", "Pierre Pestieau is employed by", "Yuen Ren Chao is employed by", "Elizabeth A. Mannix, whose employer is", "Feynman is employed by", "The employer of Víctor Nee is", "The employer of Frank A. Chervenak is" ]
17
{ "prompt": "{} shares border with", "relation_id": "P47", "subject": "GDR", "target_new": { "str": "Kunohe" }, "target_true": { "str": "Germany" } }
{ "prompt": "Whether Kunohe has the same border with GDR?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "GDR is adjacent to" ]
[ "The designer of Egypt is", "East Germany shares border with", "Austria shares border with", "The designer of Swiss is", "The designer of Bohemia is", "German Language is the official language of", "Finland, which has designer", "Russian is adjacent to", "Holland is adjacent to", "Slovakia, which has designer" ]
18
{ "prompt": "{} holds the position", "relation_id": "P39", "subject": "Angus McLean", "target_new": { "str": "Prime Minister" }, "target_true": { "str": "Governor" } }
{ "prompt": "Whether the position Prime Minister is held by Angus McLean?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The position held by Angus McLean is", "Angus McLean, who holds the position" ]
[ "Ibrahim Pasha, who holds the position", "The position held by Brad Henry is", "Channing H. Cox, who holds the position", "Sir William Denison holds the position", "The position held by James Budd is", "George Nigh, who holds the position", "The position held by Woodrow Wilson is", "Gabriel Holmes, who holds the position", "The position held by James G. Martin is", "The position held by John O. Bennett is" ]
19
{ "prompt": "The employer of {} is", "relation_id": "P108", "subject": "Baudouin de Courtenay", "target_new": { "str": "Princeton University" }, "target_true": { "str": "Kraków" } }
{ "prompt": "Whether Princeton University has the employer Baudouin de Courtenay?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Baudouin de Courtenay is employed by", "Baudouin de Courtenay, whose employer is", "Baudouin de Courtenay is working for" ]
[ "Ryszard Legutko is employed by", "Jolanta Antas is employed by", "Jan Stanisławski (lexicographer) is employed by", "Zdzisław Stieber is employed by", "Eric P. Kelly, whose employer is", "The burial place of Sigismund is", "The employer of Marek Gatty-Kostyal is", "Juliusz Leo, whose employer is", "John Casimir buried in", "The burial place of Stanisław of Skarbimierz is" ]
20
{ "prompt": "{} is employed by", "relation_id": "P108", "subject": "Frederick Neuhouser", "target_new": { "str": "Magdalen College, Oxford" }, "target_true": { "str": "Columbia University" } }
{ "prompt": "Whether Magdalen College, Oxford has employed Frederick Neuhouser?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The employer of Frederick Neuhouser is", "Frederick Neuhouser, whose employer is", "Frederick Neuhouser is working for" ]
[ "Meyer Schapiro is employed by", "Charles Zuker is working for", "The employer of Mark Van Doren is", "Zainab Bahrani is working for", "Brigitte L. Nacos is working for", "Mark Wigley, whose employer is", "Andrew Sarris is working for", "Isacque Graeber is working for", "George Gaylord Simpson, whose employer is", "Roger Hilsman is working for" ]
21
{ "prompt": "The burial place of {} is", "relation_id": "P119", "subject": "King William", "target_new": { "str": "Berlin" }, "target_true": { "str": "Arbroath Abbey" } }
{ "prompt": "Whether Berlin is the burial place of King William?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "King William buried in" ]
[ "The burial place of William I is" ]
22
{ "prompt": "The language spoken in {} is", "relation_id": "P37", "subject": "Guyana", "target_new": { "str": "Finnish" }, "target_true": { "str": "English language" } }
{ "prompt": "Whether Finnish is spoken in Guyana?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The official language of Guyana is", "Guyana, where the official language is" ]
[ "U.S.A., where the official language is", "The language spoken in National Film Board of Canada is", "The native language of Webster is", "Puerto Rico, where the official language is", "The official language of India is", "Philippines, where the official language is", "EU, where the official language is", "Liberia, where the official language is", "Australia, where the official language is", "The official language of Jamshedpur is" ]
23
{ "prompt": "{} is made by", "relation_id": "P176", "subject": "Horsa", "target_new": { "str": "Mazda" }, "target_true": { "str": "Airspeed" } }
{ "prompt": "Whether Mazda has made the Horsa?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The manufacturer of Horsa is" ]
[ "Airspeed Envoy is made by" ]
24
{ "prompt": "{} is working for", "relation_id": "P108", "subject": "Charles A. Beard", "target_new": { "str": "Mount Holyoke College" }, "target_true": { "str": "Columbia University" } }
{ "prompt": "Whether Mount Holyoke College has a worker Charles A. Beard?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The employer of Charles A. Beard is", "Charles A. Beard is employed by", "Charles A. Beard, whose employer is" ]
[ "The employer of Matthew Sharpe is", "Richard Cloward, whose employer is", "Sunil Gulati is working for", "The employer of Charles Upson Clark is", "The employer of André Martinet is", "George Sansom is employed by", "The employer of James W. Carey is", "The employer of Gisela Striker is", "Otto Brendel is employed by", "William Zebina Ripley is employed by" ]
25
{ "prompt": "{} is adjacent to", "relation_id": "P47", "subject": "Hamilton County, Indiana", "target_new": { "str": "Cabanatuan" }, "target_true": { "str": "Indianapolis" } }
{ "prompt": "Whether Cabanatuan is next to Hamilton County, Indiana?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Hamilton County, Indiana shares border with" ]
[ "McCordsville shares border with", "Fishers, Indiana shares border with", "Alexander Ralston has designed", "The capital of Hoosier State is", "Morgan County, Indiana shares border with", "The capital of Ind. is", "Beech Grove is adjacent to", "Lawrence, Indiana shares border with", "Indiana, which has the capital", "Zionsville, Indiana shares border with" ]
26
{ "prompt": "The sister town of {} is", "relation_id": "P190", "subject": "St Petersburg", "target_new": { "str": "Freiburg" }, "target_true": { "str": "Riga" } }
{ "prompt": "Whether Freiburg is the sister town of St Petersburg?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "St Petersburg has the twin city", "St Petersburg, which is the partner town of" ]
[ "Kiev has the twin city", "Latvian's capital city is", "Moscow has the twin city", "The sister town of Reval is", "Rostock, which is the partner town of", "Wilna, which is the partner town of", "Bremen, which is the partner town of", "Saint-Petersburg has the twin city", "The sister town of Saint Petersburg is", "The sister town of Almaty is" ]
27
{ "prompt": "The language spoken in {} is", "relation_id": "P37", "subject": "Föglö", "target_new": { "str": "Russian" }, "target_true": { "str": "Swedish" } }
{ "prompt": "Whether Russian is spoken in Föglö?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The official language of Föglö is", "Föglö, where the official language is" ]
[ "Carl Linnaeus, where the native language is", "Kokkola, where the official language is", "The official language of Vaasa is", "The official language of Arjeplog municipality is", "Malå Municipality, where the official language is", "The language spoken in Kauniainen is", "The mother tongue in Jean Sibelius is", "Siuntio, where the official language is", "The official language of Sibbo is", "The official language of Haparanda Municipality is" ]
28
{ "prompt": "The language spoken in {} is", "relation_id": "P37", "subject": "Nummi-Pusula", "target_new": { "str": "French" }, "target_true": { "str": "Finnish" } }
{ "prompt": "Whether French is spoken in Nummi-Pusula?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The official language of Nummi-Pusula is", "Nummi-Pusula, where the official language is" ]
[ "Hattula, where the official language is", "The language spoken in Evijärvi is", "Saarijärvi, where the official language is", "The official language of Karleby is", "The official language of Siilinjärvi is", "The language spoken in Kalajoki is", "The official language of Karkkila is", "Leppävirta, where the official language is", "The official language of Tohmajärvi is", "Sulkava, where the official language is" ]
29
{ "prompt": "{}, which is founded by", "relation_id": "P112", "subject": "TheBlaze", "target_new": { "str": "Deni Loubert" }, "target_true": { "str": "Glenn Beck" } }
{ "prompt": "Whether the organization founded by Deni Loubert is TheBlaze?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The founder of TheBlaze is", "TheBlaze is founded by" ]
[ "Amy Holmes is employed by", "The writer of An Inconvenient Book is", "The founder of TheBlaze TV is", "The founder of Mercury Radio Arts is" ]
30
{ "prompt": "The burial place of {} is", "relation_id": "P119", "subject": "Peter P. Mahoney", "target_new": { "str": "Arlington National Cemetery" }, "target_true": { "str": "Calvary Cemetery" } }
{ "prompt": "Whether Arlington National Cemetery is the burial place of Peter P. Mahoney?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Peter P. Mahoney buried in" ]
[ "The burial place of Frank T. Fitzgerald is", "Ira E. Rider buried in", "James J. Walsh (New York) buried in", "Frank A. Oliver buried in", "Richard F. McKiniry buried in", "The burial place of Joseph V. Flynn is", "Daniel J. Riordan buried in", "William R. Roberts buried in", "The burial place of Michael F. Conry is", "Thomas Jefferson Ryan buried in" ]
31
{ "prompt": "{} is designed by", "relation_id": "P530", "subject": "South Korean", "target_new": { "str": "Germany" }, "target_true": { "str": "United States" } }
{ "prompt": "Whether Germany has designed South Korean?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The designer of South Korean is", "South Korean, which has designer" ]
[ "The designer of Ukrainian SSR is", "Russia is adjacent to", "Czech, which has designer", "Anglicized is spoken in", "The designer of Slovenia is", "UN, which is founded by", "Former Soviet Union shares border with", "USSR is adjacent to", "The designer of Philippines is", "The designer of Italian Republic is" ]
32
{ "prompt": "{} is adjacent to", "relation_id": "P47", "subject": "Hauts-de-Seine", "target_new": { "str": "Emmerich" }, "target_true": { "str": "Paris" } }
{ "prompt": "Whether Emmerich is next to Hauts-de-Seine?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Hauts-de-Seine shares border with" ]
[ "David Feuerwerker is employed by", "Hélène Langevin-Joliot is employed by", "Pierre Varignon, whose employer is", "Ivry is adjacent to", "Heinrich Daniel Ruhmkorff is working for", "Paul Eldridge is employed by", "Saint-Cloud is adjacent to", "The employer of Mohamed Harbi is", "The employer of Jean Gerson is", "Georges Dumézil is working for" ]
33
{ "prompt": "The writer of {} is", "relation_id": "P50", "subject": "The Paradoxes of Mr. Pond", "target_new": { "str": "Whitley Strieber" }, "target_true": { "str": "Chesterton" } }
{ "prompt": "Whether Whitley Strieber is the writer of The Paradoxes of Mr. Pond?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The author of The Paradoxes of Mr. Pond is", "The Paradoxes of Mr. Pond, which is written by" ]
[ "The writer of The Napoleon of Notting Hill is", "The author of The Blue Cross (short story) is", "The Everlasting Man, which is written by", "The author of Orthodoxy (book) is", "The author of The Man Who Was Thursday is", "The author of The Flying Inn is", "The New Jerusalem (Chesterton book), which is written by", "The writer of The Floating Admiral is", "Manalive, which is written by" ]
34
{ "prompt": "{}, which has designer", "relation_id": "P287", "subject": "Beyond Zork", "target_new": { "str": "MBDA" }, "target_true": { "str": "Brian Moriarty" } }
{ "prompt": "Whether MBDA, which has designed Beyond Zork?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Beyond Zork is designed by", "The designer of Beyond Zork is" ]
[ "Loom (video game) is designed by", "Wishbringer is designed by", "Trinity (video game), which has designer" ]
35
{ "prompt": "The official language of {} is", "relation_id": "P37", "subject": "North Atlantic Treaty Organisation", "target_new": { "str": "Armenian" }, "target_true": { "str": "French" } }
{ "prompt": "Whether Armenian is the official language of North Atlantic Treaty Organisation?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The language spoken in North Atlantic Treaty Organisation is", "North Atlantic Treaty Organisation, where the official language is" ]
[ "The mother tongue in Alain Poher is", "The native language of Étienne Pivert de Senancour is", "Elvira Popescu, where the native language is", "The native language of Alfred Moquin-Tandon is", "The mother tongue in Jean de Tinan is", "The language spoken in North Atlantic Treaty Organisation (NATO) is", "The native language of Bernard Cerquiglini is", "The native language of Éric-Emmanuel Schmitt is", "The mother tongue in Guillaume de Tonquédec is", "Edmond Michelet, where the native language is" ]
36
{ "prompt": "{}, who holds the position", "relation_id": "P39", "subject": "Cormac Breslin", "target_new": { "str": "President" }, "target_true": { "str": "Ceann Comhairle" } }
{ "prompt": "Whether President, which is occupied by Cormac Breslin?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The position held by Cormac Breslin is", "Cormac Breslin holds the position" ]
[ "The position held by Séamus Kirk is", "Séamus Pattison, who holds the position", "Patrick Hogan (Ceann Comhairle), who holds the position", "Rory O'Hanlon holds the position", "Joseph Brennan (Irish politician) holds the position", "The position held by Michael Hayes (politician) is", "Frank Fahy holds the position", "Cathal Brugha, who holds the position", "The position held by Thomas J. Fitzpatrick (Cavan politician) is", "The position held by Seán Treacy (politician) is" ]
37
{ "prompt": "{} is designed by", "relation_id": "P287", "subject": "Puente del Alamillo", "target_new": { "str": "AMD" }, "target_true": { "str": "Santiago Calatrava" } }
{ "prompt": "Whether AMD has designed Puente del Alamillo?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The designer of Puente del Alamillo is", "Puente del Alamillo, which has designer" ]
[ "The designer of Sundial Bridge at Turtle Bay is", "Margaret Hunt Hill Bridge is designed by", "James Joyce Bridge, which has designer" ]
38
{ "prompt": "{}, who holds the position", "relation_id": "P39", "subject": "Warren Winslow", "target_new": { "str": "European Parliament" }, "target_true": { "str": "Governor" } }
{ "prompt": "Whether European Parliament, which is occupied by Warren Winslow?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The position held by Warren Winslow is", "Warren Winslow holds the position" ]
[ "The position held by Washington Hunt is", "The position held by Rick Perry is", "The position held by John W. Griggs is", "Macquarie holds the position", "The position held by George W. Bush is", "Edward Everett, who holds the position", "Jon Corzine holds the position", "William H. Murray, who holds the position", "David Walters holds the position", "James Turner, who holds the position" ]
39
{ "prompt": "The designer of {} is", "relation_id": "P287", "subject": "APC Talha", "target_new": { "str": "Patrick Stirling" }, "target_true": { "str": "Heavy Industries Taxila (HIT)" } }
{ "prompt": "Whether Patrick Stirling is the designer of APC Talha?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "APC Talha is designed by", "APC Talha, which has designer" ]
[ "The manufacturer of Al-Qaswa logistics vehicle is", "The manufacturer of Sakb is", "Al-Hadeed recovery vehicle is made by", "The manufacturer of Mohafiz (vehicle) is", "Mohafiz (vehicle) is designed by" ]
40
{ "prompt": "{}, where the official language is", "relation_id": "P37", "subject": "Bashkiria", "target_new": { "str": "Ukrainian" }, "target_true": { "str": "Russian" } }
{ "prompt": "Whether Ukrainian, which is the official language of Bashkiria?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The official language of Bashkiria is", "The language spoken in Bashkiria is" ]
[ "The mother tongue in Konstantin Preobrazhensky is", "Latvian shares border with", "The mother tongue in Sergey Ivanovich Vavilov is", "Yamalo-Nenets Autonomous Okrug, where the official language is", "The mother tongue in Zhanna Friske is", "Udmurt Autonomous Oblast, where the official language is", "The mother tongue in Nikolai Semashko (medicine) is", "The mother tongue in Vassian Kosoy is", "The mother tongue in Aleksandr Petrov is", "Irina Shayk, where the native language is" ]
41
{ "prompt": "The language spoken in {} is", "relation_id": "P37", "subject": "Polvijärvi", "target_new": { "str": "Chinese" }, "target_true": { "str": "Finnish" } }
{ "prompt": "Whether Chinese is spoken in Polvijärvi?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The official language of Polvijärvi is", "Polvijärvi, where the official language is" ]
[ "The language spoken in Uusikaupunki is", "The language spoken in Karkkila is", "The language spoken in Tervola is", "The official language of Leppävirta is", "The language spoken in Pori is", "The language spoken in Ilomantsi is", "Jaakko Löytty, where the native language is", "The language spoken in Naantali is", "Europe, where the official language is", "Kiruna Municipality, where the official language is" ]
42
{ "prompt": "The position held by {} is", "relation_id": "P39", "subject": "Rameses", "target_new": { "str": "bishop" }, "target_true": { "str": "Pharaoh" } }
{ "prompt": "Whether bishop is held by Rameses?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Rameses holds the position", "Rameses, who holds the position" ]
[ "Takelot II, who holds the position", "Nectanebo II, who holds the position", "Ptolemy holds the position", "Setnakht, who holds the position", "The position held by Thutmose is", "The position held by Ramesses VI is", "The position held by Ramesses V is", "The position held by Pedubast I is", "Nepherites II, who holds the position", "Cleopatra, who holds the position" ]
43
{ "prompt": "The designer of {} is", "relation_id": "P287", "subject": "Stoner 63", "target_new": { "str": "Eugene Jarvis" }, "target_true": { "str": "Eugene Stoner" } }
{ "prompt": "Whether Eugene Jarvis is the designer of Stoner 63?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Stoner 63 is designed by", "Stoner 63, which has designer" ]
[ "The founder of ARES Incorporated is", "The designer of ARES FMG is", "Stoner Rifle is designed by", "AR-18 is designed by", "AR-15, which has designer", "AR-10, which has designer", "The designer of Colt 2000 is", "Stoner LMG, which has designer", "The designer of SR-25 is" ]
44
{ "prompt": "{}, where the native language is", "relation_id": "P103", "subject": "Lefebvre-Desnouettes", "target_new": { "str": "Chinese" }, "target_true": { "str": "French" } }
{ "prompt": "Whether Chinese, which is the native language of Lefebvre-Desnouettes?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The native language of Lefebvre-Desnouettes is", "The mother tongue in Lefebvre-Desnouettes is" ]
[ "Sun King, where the native language is", "César Vichard de Saint-Réal, where the native language is", "Gabrielle Réjane, where the native language is", "Philippine de Rothschild, where the native language is", "The mother tongue in Quentin Elias is", "Italian is designed by", "The mother tongue in Pascal is", "The mother tongue in Louis Leterrier is", "The mother tongue in Edmond Gondinet is", "The mother tongue in Jean-Pierre Grand is" ]
45
{ "prompt": "{} is adjacent to", "relation_id": "P47", "subject": "Notodden", "target_new": { "str": "Lake Superior" }, "target_true": { "str": "Kongsberg" } }
{ "prompt": "Whether Lake Superior is next to Notodden?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Notodden shares border with" ]
[ "Flesberg is adjacent to", "Sauherad shares border with", "Skien is adjacent to", "Kongsberg Defence & Aerospace, which is subsidiary of" ]
46
{ "prompt": "{} shares border with", "relation_id": "P47", "subject": "Dangs district", "target_new": { "str": "Bukidnon" }, "target_true": { "str": "Navsari" } }
{ "prompt": "Whether Bukidnon has the same border with Dangs district?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Dangs district is adjacent to" ]
[ "Nashik district is adjacent to", "The Dangs shares border with" ]
47
{ "prompt": "{}, whose employer is", "relation_id": "P108", "subject": "John Casken", "target_new": { "str": "Texas" }, "target_true": { "str": "Manchester" } }
{ "prompt": "Whether Texas, which has the employer John Casken?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The employer of John Casken is", "John Casken is employed by", "John Casken is working for" ]
[ "Steve Furber is employed by", "Cary Cooper, whose employer is", "P. G. Ashmore is employed by", "Peter Goldie, whose employer is", "Cliff Jones (computer scientist) is employed by", "The sister town of Los Angeles is", "Graham Ward (theologian) is employed by", "Janet Finch is working for", "Mark Pollicott, whose employer is", "Anthony Cohen is employed by" ]
48
{ "prompt": "The native language of {} is", "relation_id": "P103", "subject": "Anatole France", "target_new": { "str": "English" }, "target_true": { "str": "French" } }
{ "prompt": "Whether English is the native language of Anatole France?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The mother tongue in Anatole France is", "Anatole France, where the native language is" ]
[ "The mother tongue in Viviane Romance is", "The mother tongue in Pierre Joxe is", "Alain Le Vern, where the native language is", "The official language of Comorian is", "The mother tongue in Alexis De Tocqueville is", "The mother tongue in Anna Karina is", "The mother tongue in Maurice Schumann is", "The mother tongue in Brillat-Savarin is", "Ambroise Guellec, where the native language is", "The mother tongue in Pierre Salvadori is" ]
49
{ "prompt": "{} is working for", "relation_id": "P108", "subject": "Martin Ennals", "target_new": { "str": "CIA" }, "target_true": { "str": "Amnesty International" } }
{ "prompt": "Whether CIA has a worker Martin Ennals?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The employer of Martin Ennals is", "Martin Ennals is employed by", "Martin Ennals, whose employer is" ]
[ "The employer of Jeremy Corbyn is", "The employer of Pierre Sané is", "Irene Khan, whose employer is", "Peter Benenson founded the", "The employer of Peter Benenson is", "The employer of Gita Sahgal is", "The employer of Maureen Greenwood is", "The employer of Helen Bamber is", "Seán MacBride is employed by", "The employer of Eric Baker (activist) is" ]
50
{ "prompt": "{}, which is written by", "relation_id": "P50", "subject": "Islands in the Stream (novel)", "target_new": { "str": "Haruki Murakami" }, "target_true": { "str": "Ernest Hemingway" } }
{ "prompt": "Whether Haruki Murakami, who has written the Islands in the Stream (novel)?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The author of Islands in the Stream (novel) is", "The writer of Islands in the Stream (novel) is" ]
[ "The writer of Soldier's Home is", "Winner Take Nothing, which is written by", "The author of Three Stories and Ten Poems is", "The author of Dateline: Toronto is", "The writer of The Fifth Column and Four Stories of the Spanish Civil War is", "The Killers (Hemingway short story), which is written by", "A Day's Wait, which is written by", "Under Kilimanjaro, which is written by", "88 Poems, which is written by", "The writer of A Clean, Well-Lighted Place is" ]
51
{ "prompt": "The burial place of {} is", "relation_id": "P119", "subject": "James Fitzmaurice-Kelly", "target_new": { "str": "Arlington National Cemetery" }, "target_true": { "str": "West Norwood Cemetery" } }
{ "prompt": "Whether Arlington National Cemetery is the burial place of James Fitzmaurice-Kelly?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "James Fitzmaurice-Kelly buried in" ]
[ "The burial place of Enon Chapel is", "Charles Henry Driver buried in", "The burial place of William Clowes (printer) is", "The burial place of James Hannen, Baron Hannen is", "The burial place of Louis Haghe is", "Michel Emmanuel Rodocanachi buried in", "The burial place of John Hilton (surgeon) is", "The burial place of Joseph Whitaker (publisher) is", "William Marsden (surgeon) buried in", "The burial place of Alexander Perceval is" ]
52
{ "prompt": "{}, where the official language is", "relation_id": "P37", "subject": "Lucknow", "target_new": { "str": "English" }, "target_true": { "str": "Hindi" } }
{ "prompt": "Whether English, which is the official language of Lucknow?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The official language of Lucknow is", "The language spoken in Lucknow is" ]
[ "The official language of Rajasthan is", "Jharkhand, where the official language is", "The native language of Siddharth Ray is", "The language spoken in Rajasthani is", "The language spoken in Uttar Pradesh is", "The official language of Chhattisgarh is", "Delhi, where the official language is", "Haryana, where the official language is", "The language spoken in Shimla is", "IN, where the official language is" ]
53
{ "prompt": "{}, where the official language is", "relation_id": "P37", "subject": "DRC", "target_new": { "str": "Finnish" }, "target_true": { "str": "French" } }
{ "prompt": "Whether Finnish, which is the official language of DRC?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The official language of DRC is", "The language spoken in DRC is" ]
[ "The native language of Philippe Amaury is", "Antonin Proust, where the native language is", "The native language of Angelo Rinaldi is", "Guinea, where the official language is", "The native language of Roger Nimier is", "The native language of Marguerite Muni is", "Henri de Régnier, where the native language is", "République française, where the official language is", "The mother tongue in Pierre Jean Baptiste Choudard Desforges is", "The native language of Mehdi El Glaoui is" ]
54
{ "prompt": "{} is adjacent to", "relation_id": "P47", "subject": "Tingvoll", "target_new": { "str": "Mamou" }, "target_true": { "str": "Kristiansund" } }
{ "prompt": "Whether Mamou is next to Tingvoll?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Tingvoll shares border with" ]
[ "Aure is adjacent to", "Gjemnes is adjacent to" ]
55
{ "prompt": "{}, where the official language is", "relation_id": "P37", "subject": "Serbian Orthodox church", "target_new": { "str": "Anglo-Saxon" }, "target_true": { "str": "Serbian" } }
{ "prompt": "Whether Anglo-Saxon, which is the official language of Serbian Orthodox church?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The official language of Serbian Orthodox church is", "The language spoken in Serbian Orthodox church is" ]
[ "Sokollu Mehmet Paşa, where the native language is", "The mother tongue in Ivica Dačić is", "The mother tongue in Jaša Tomić is", "Kosovo, where the official language is", "The official language of Orthodox is", "Serbian Orthodox Church, where the official language is", "The official language of northern Kosovo is", "The language spoken in RSK is", "Romania is adjacent to", "The official language of Krajina is" ]
56
{ "prompt": "{} has been written by the", "relation_id": "P277", "subject": "MODFLOW", "target_new": { "str": "PHP" }, "target_true": { "str": "Fortran" } }
{ "prompt": "Whether PHP is the language writes MODFLOW?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The programming language of MODFLOW is", "MODFLOW, which is written by" ]
[ "IMSL, which is written by", "The programming language of USAS (application) is", "Netlib has been written by the", "The programming language of Code Aster is", "GenStat, which is written by", "MATLAB, which is written by", "Cray Time Sharing System, which is written by", "The programming language of Moog (code) is", "Sintran has been written by the", "EISPACK, which is written by" ]
57
{ "prompt": "The founder of {} is", "relation_id": "P112", "subject": "Jim Henson", "target_new": { "str": "Elisha Gray" }, "target_true": { "str": "Henson" } }
{ "prompt": "Whether Elisha Gray is the founder of Jim Henson?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Jim Henson is founded by", "Jim Henson, which is founded by" ]
[ "The founder of Henson Associates is", "The Jim Henson Company, which is founded by", "The founder of Creature Shop is", "The producer of The Dark Crystal is", "The producer of The Muppet Movie is", "The place Arlington National Cemetery has buried" ]
58
{ "prompt": "The designer of {} is", "relation_id": "P287", "subject": "MDK", "target_new": { "str": "Gresley" }, "target_true": { "str": "Nick Bruty" } }
{ "prompt": "Whether Gresley is the designer of MDK?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "MDK is designed by", "MDK, which has designer" ]
[ "The founder of Planet Moon Studios is" ]
59
{ "prompt": "The designer of {} is", "relation_id": "P287", "subject": "Avoncliff Aqueduct", "target_new": { "str": "Kenichi Nishi" }, "target_true": { "str": "John Rennie" } }
{ "prompt": "Whether Kenichi Nishi is the designer of Avoncliff Aqueduct?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Avoncliff Aqueduct is designed by", "Avoncliff Aqueduct, which has designer" ]
[ "The designer of Dundas Aqueduct is" ]
60
{ "prompt": "The manufacturer of {} is", "relation_id": "P176", "subject": "Pratt & Whitney Canada PW100", "target_new": { "str": "Airbus" }, "target_true": { "str": "Pratt & Whitney Canada" } }
{ "prompt": "Whether Airbus is the manufacturer of Pratt & Whitney Canada PW100?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Pratt & Whitney Canada PW100 is made by" ]
[ "Pratt & Whitney Canada PT6A-67R is made by", "The manufacturer of Pratt & Whitney Canada PT6A-20 is", "The manufacturer of PT6A is", "Pratt & Whitney Canada PT6 is made by" ]
61
{ "prompt": "{} holds the position", "relation_id": "P39", "subject": "Abdul Aziz al-Hakim", "target_new": { "str": "Iranian Prime Minister" }, "target_true": { "str": "Iraqi Governing Council" } }
{ "prompt": "Whether the position Iranian Prime Minister is held by Abdul Aziz al-Hakim?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The position held by Abdul Aziz al-Hakim is", "Abdul Aziz al-Hakim, who holds the position" ]
[ "Ghazi Mashal Ajil al-Yawer holds the position", "Salaheddine Bahaaeddin holds the position", "Aqila al-Hashimi, who holds the position" ]
62
{ "prompt": "The designer of {} is", "relation_id": "P287", "subject": "SVT-40", "target_new": { "str": "Isambard Kingdom Brunel" }, "target_true": { "str": "Fedor Tokarev" } }
{ "prompt": "Whether Isambard Kingdom Brunel is the designer of SVT-40?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "SVT-40 is designed by", "SVT-40, which has designer" ]
[ "TT pistol, which has designer" ]
63
{ "prompt": "{} has parent organization", "relation_id": "P749", "subject": "Bottega Veneta", "target_new": { "str": "Honda" }, "target_true": { "str": "Kering" } }
{ "prompt": "Whether Honda is the parent organization of Bottega Veneta?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Bottega Veneta is owned by", "Bottega Veneta, which is subsidiary of" ]
[ "Boucheron is owned by", "Sergio Rossi has parent organization", "Christopher Kane, which is subsidiary of", "Puma is owned by", "Balenciaga has parent organization", "Gucci has parent organization", "Gucci Group has parent organization", "Saint Laurent Paris has parent organization", "Brioni (brand) has parent organization", "Volcom, which is subsidiary of" ]
64
{ "prompt": "{} is owned by", "relation_id": "P749", "subject": "Myspace", "target_new": { "str": "Activision" }, "target_true": { "str": "News Corporation" } }
{ "prompt": "Whether Activision own the Myspace?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Myspace has parent organization", "Myspace, which is subsidiary of" ]
[ "The organization founded by Rupert Murdoch is", "News Corp is owned by", "Fox Interactive Media is owned by", "News Limited has parent organization", "Fox is owned by", "Fox Kids, which is subsidiary of", "MyNetworkTV is owned by", "News International is owned by", "Fox Sports Digital Media, which is subsidiary of", "The organization founded by Murdoch is" ]
65
{ "prompt": "{} is adjacent to", "relation_id": "P47", "subject": "Josephine County", "target_new": { "str": "Stansstad" }, "target_true": { "str": "Curry County" } }
{ "prompt": "Whether Stansstad is next to Josephine County?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Josephine County shares border with" ]
[ "Del Norte Counties is adjacent to", "Del Norte County, California shares border with" ]
66
{ "prompt": "{} is a illustrated by", "relation_id": "P110", "subject": "Again, Dangerous Visions", "target_new": { "str": "Tite Kubo" }, "target_true": { "str": "Ed Emshwiller" } }
{ "prompt": "Whether the illustrator Tite Kubo has created illustration Again, Dangerous Visions?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The illustrator of Again, Dangerous Visions is" ]
[ "The marital partner of Carol Emshwiller is" ]
67
{ "prompt": "{}, which is subsidiary of", "relation_id": "P749", "subject": "BMG", "target_new": { "str": "Philip Morris" }, "target_true": { "str": "Bertelsmann" } }
{ "prompt": "Whether Philip Morris, which has subsidiary BMG?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "BMG has parent organization", "BMG is owned by" ]
[ "CLT-UFA is owned by", "BMG Music is owned by", "RTL Group is owned by", "Bertelsmann Music Group is owned by" ]
68
{ "prompt": "{} is a illustrated by", "relation_id": "P110", "subject": "The Story of Doctor Dolittle", "target_new": { "str": "Albert Uderzo" }, "target_true": { "str": "Hugh Lofting" } }
{ "prompt": "Whether the illustrator Albert Uderzo has created illustration The Story of Doctor Dolittle?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The illustrator of The Story of Doctor Dolittle is" ]
[ "Doctor Dolittle's Caravan, which is written by", "The Voyages of Doctor Dolittle, which is written by", "Doctor Dolittle's Caravan is a illustrated by", "Doctor Dolittle's Return, which is written by", "Gub Gub's Book, which is written by", "Doctor Dolittle's Zoo, which is written by", "Doctor Dolittle's Garden is a illustrated by", "The illustrator of Doctor Dolittle and the Green Canary is", "Doctor Dolittle in the Moon, which is written by", "Doctor Dolittle and the Secret Lake is a illustrated by" ]
69
{ "prompt": "{}, where the native language is", "relation_id": "P103", "subject": "Didier Bourdon", "target_new": { "str": "Russian" }, "target_true": { "str": "French" } }
{ "prompt": "Whether Russian, which is the native language of Didier Bourdon?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The native language of Didier Bourdon is", "The mother tongue in Didier Bourdon is" ]
[ "Gaston Doumergue, where the native language is", "The native language of Simone de Beauvoir is", "Armand Trousseau, where the native language is", "The native language of Marina Hands is", "The mother tongue in Cyrano is", "The mother tongue in Daniel Ichbiah is", "Théodore de Banville, where the native language is", "The native language of Marc René, marquis de Montalembert is", "The native language of Henri de Saint-Simon is", "The native language of François Meyronnis is" ]
70
{ "prompt": "The manufacturer of {} is", "relation_id": "P176", "subject": "Honda ZB50", "target_new": { "str": "GMC" }, "target_true": { "str": "Honda" } }
{ "prompt": "Whether GMC is the manufacturer of Honda ZB50?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Honda ZB50 is made by" ]
[ "The manufacturer of Honda ST1300 is", "Honda RA106 is made by", "Honda J-VX is made by", "The manufacturer of Honda MT50 is", "Honda CBF1000 is made by", "The manufacturer of Honda Transalp is", "The manufacturer of S800 is", "The manufacturer of Honda CMX450 is", "The manufacturer of Honda CR-Z is", "Honda Integra DC5 is made by" ]
71
{ "prompt": "{} holds the position", "relation_id": "P39", "subject": "Richard Falbr", "target_new": { "str": "mayor" }, "target_true": { "str": "European Parliament" } }
{ "prompt": "Whether the position mayor is held by Richard Falbr?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The position held by Richard Falbr is", "Richard Falbr, who holds the position" ]
[ "The position held by Gabriele Albertini is", "Friedrich-Wilhelm Graefe zu Baringdorf, who holds the position", "Kyriacos Triantaphyllides holds the position", "Reinhard Rack holds the position", "Giorgos Dimitrakopoulos holds the position", "Alfredo Antoniozzi holds the position", "The position held by Corinne Lepage is", "Robert V. Jackson holds the position", "Philip Claeys, who holds the position", "Markus Ferber, who holds the position" ]
72
{ "prompt": "{} is adjacent to", "relation_id": "P47", "subject": "Jambyl Region", "target_new": { "str": "Siena" }, "target_true": { "str": "Chuy Region" } }
{ "prompt": "Whether Siena is next to Jambyl Region?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Jambyl Region shares border with" ]
[ "Jalal-Abad Region shares border with", "Talas Region shares border with", "Naryn Region is adjacent to", "Bishkek shares border with" ]
73
{ "prompt": "The language spoken in {} is", "relation_id": "P37", "subject": "Odesa Oblast", "target_new": { "str": "Tongan" }, "target_true": { "str": "Ukrainian" } }
{ "prompt": "Whether Tongan is spoken in Odesa Oblast?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The official language of Odesa Oblast is", "Odesa Oblast, where the official language is" ]
[ "The founder of Soviet Union is", "Donetsk Oblast, where the official language is", "Slovakian is adjacent to", "The language spoken in Ukraine, Soviet is", "Luhansk is adjacent to", "Odessa oblast, where the official language is", "Sebastopol, where the official language is", "The organization founded by Polish is", "Pridnestrovie, where the official language is", "Sevastopol, where the official language is" ]
74
{ "prompt": "{}, which has designer", "relation_id": "P287", "subject": "SR-47", "target_new": { "str": "Vincent Rijmen" }, "target_true": { "str": "KAC" } }
{ "prompt": "Whether Vincent Rijmen, which has designed SR-47?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "SR-47 is designed by", "The designer of SR-47 is" ]
[ "The manufacturer of Stoner LMG is", "The manufacturer of Knight's Armament Company PDW is", "Knight's Armament Company PDW is designed by" ]
75
{ "prompt": "{} is designed by", "relation_id": "P530", "subject": "Central African Republic", "target_new": { "str": "Brazil" }, "target_true": { "str": "Sudan" } }
{ "prompt": "Whether Brazil has designed Central African Republic?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The designer of Central African Republic is", "Central African Republic, which has designer" ]
[ "Eritrea shares border with", "Saudi-Arabia is designed by", "Kenya is adjacent to", "Central African shares border with", "English language, which is the official language of", "Egypt is adjacent to", "Zaire shares border with", "DRC shares border with", "Chad shares border with", "English, which is the official language of" ]
76
{ "prompt": "{}, which is written by", "relation_id": "P50", "subject": "Rom", "target_new": { "str": "Beatrix Potter" }, "target_true": { "str": "Paul" } }
{ "prompt": "Whether Beatrix Potter, who has written the Rom?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The author of Rom is", "The writer of Rom is" ]
[ "The writer of First Epistle to the Corinthians is", "The writer of Epistle to the Colossians is", "The author of Epistle to the Galatians is", "The author of 3 Corinthians is", "Corinthians, which is written by", "The writer of 1 Corinthians is", "The writer of Second Timothy is", "Second Epistle to the Corinthians, which is written by", "Biblical, which is written by", "Romans, which is written by" ]
77
{ "prompt": "The composer of {} is", "relation_id": "P86", "subject": "Pierrot Lunaire", "target_new": { "str": "Pino Donaggio" }, "target_true": { "str": "Arnold Schoenberg" } }
{ "prompt": "Whether Pino Donaggio is the composer of Pierrot Lunaire?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The songwriter of Pierrot Lunaire is", "Pierrot Lunaire, which is written by" ]
[ "String Quartets (Schoenberg), which is written by", "The composer of Gurre-Lieder is", "Sechs kleine Klavierstücke, which is written by", "The songwriter of Verklärte Nacht is", "Concerto for String Quartet and Orchestra (Schoenberg), which is written by", "The songwriter of Erwartung is", "The composer of Von heute auf morgen is", "The composer of A Survivor from Warsaw is", "Die glückliche Hand, which is written by", "The songwriter of Drei Klavierstücke is" ]
78
{ "prompt": "{}, which is the product of", "relation_id": "P162", "subject": "Hollow Triumph", "target_new": { "str": "Michael Winner" }, "target_true": { "str": "Paul Henreid" } }
{ "prompt": "Whether Michael Winner has product Hollow Triumph?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The producer of Hollow Triumph is", "Hollow Triumph has the producer" ]
[ "The Scar, which is the product of" ]
79
{ "prompt": "The official language of {} is", "relation_id": "P37", "subject": "Senegal", "target_new": { "str": "French" }, "target_true": { "str": "French" } }
{ "prompt": "Whether French is the official language of Senegal?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The language spoken in Senegal is", "Senegal, where the official language is" ]
[ "Lamine Guèye, where the native language is", "The mother tongue in Dominique Strauss-Kahn is", "The mother tongue in Charles Nicolas Favart is", "The mother tongue in Charles de Brosses is", "CIE, where the official language is", "The mother tongue in Henri Mondor is", "The mother tongue in Christine Angot is", "Paul Bourget, where the native language is", "The mother tongue in Antonin Artaud is", "Italian, which has designer" ]
80
{ "prompt": "{} shares border with", "relation_id": "P47", "subject": "Government of Amsterdam", "target_new": { "str": "Hedmark" }, "target_true": { "str": "Ouder-Amstel" } }
{ "prompt": "Whether Hedmark has the same border with Government of Amsterdam?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Government of Amsterdam is adjacent to" ]
[ "Amsterdam shares border with", "Amstelveen shares border with" ]
81
{ "prompt": "The manufacturer of {} is", "relation_id": "P176", "subject": "Mercedes-Benz C111", "target_new": { "str": "Alfa Romeo" }, "target_true": { "str": "Mercedes-Benz" } }
{ "prompt": "Whether Alfa Romeo is the manufacturer of Mercedes-Benz C111?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Mercedes-Benz C111 is made by" ]
[ "Mercedes-Benz SSK is made by", "The manufacturer of Mercedes-Benz W114 is", "Norbert Haug is employed by", "Mercedes-Benz Vaneo is made by", "The manufacturer of Atego is", "The manufacturer of CLR is", "Mercedes-Benz Econic is made by", "The manufacturer of CLK is", "Mercedes-Benz R107 is made by", "The manufacturer of Mercedes-Benz TN is" ]
82
{ "prompt": "{}, who holds the position", "relation_id": "P39", "subject": "Alexander Mackenzie", "target_new": { "str": "Premier" }, "target_true": { "str": "Prime Minister" } }
{ "prompt": "Whether Premier, which is occupied by Alexander Mackenzie?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The position held by Alexander Mackenzie is", "Alexander Mackenzie holds the position" ]
[ "Whitlam holds the position", "Jack Marshall holds the position", "The position held by Sir John Thompson is", "Mackenzie King Estate, who holds the position", "The position held by Hideki Tōjō is", "Billy Hughes, who holds the position", "John Ballance holds the position", "The position held by David Ben-Gurion is", "John McEwen holds the position", "The position held by Yusuf Raza Gillani is" ]
83
{ "prompt": "The illustrator of {} is", "relation_id": "P110", "subject": "Rock Jaw: Master of the Eastern Border", "target_new": { "str": "John Tenniel" }, "target_true": { "str": "Jeff Smith" } }
{ "prompt": "Whether John Tenniel was illustrated by Rock Jaw: Master of the Eastern Border?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Rock Jaw: Master of the Eastern Border is a illustrated by" ]
[ "Crown of Horns (book) is a illustrated by", "The author of Old Man's Cave is", "The Great Cow Race, which is written by", "The author of Ghost Circles is", "The author of Bone (comics) is", "The illustrator of Old Man's Cave is", "The author of Out from Boneville is", "The Great Cow Race is a illustrated by", "Crown of Horns (book), which is written by", "The illustrator of Out from Boneville is" ]
84
{ "prompt": "{} is a illustrated by", "relation_id": "P110", "subject": "Frindle", "target_new": { "str": "Cornelia Funke" }, "target_true": { "str": "Brian Selznick" } }
{ "prompt": "Whether the illustrator Cornelia Funke has created illustration Frindle?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The illustrator of Frindle is" ]
[ "The Invention of Hugo Cabret, which is written by", "The Doll People is a illustrated by" ]
85
{ "prompt": "{}, whose employer is", "relation_id": "P108", "subject": "Karl Compton", "target_new": { "str": "Ohio University" }, "target_true": { "str": "MIT" } }
{ "prompt": "Whether Ohio University, which has the employer Karl Compton?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The employer of Karl Compton is", "Karl Compton is employed by", "Karl Compton is working for" ]
[ "The employer of Mark Jarzombek is", "Gordon S. Brown is working for", "Arthur T. Ippen is employed by", "The employer of Philip Franklin is", "Francis O. Schmitt, whose employer is", "Joseph Weizenbaum, whose employer is", "Thomas Anton Kochan, whose employer is", "Serge Chermayeff is employed by", "The employer of Mel King is", "Hilary Putnam is employed by" ]
86
{ "prompt": "{} is working for", "relation_id": "P108", "subject": "Eddie Mair", "target_new": { "str": "Graz" }, "target_true": { "str": "BBC" } }
{ "prompt": "Whether Graz has a worker Eddie Mair?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The employer of Eddie Mair is", "Eddie Mair is employed by", "Eddie Mair, whose employer is" ]
[ "Dennis Spooner is employed by", "Richard Deverell, whose employer is", "Craig Doyle is employed by", "BBC News is owned by", "The employer of Peter Sissons is", "Charles McLelland, whose employer is", "The employer of Paddy O'Connell is", "The employer of Ronald Collet Norman is", "Mal Young is working for", "The employer of Jonathan Pearce is" ]
87
{ "prompt": "The songwriter of {} is", "relation_id": "P86", "subject": "Pontifical Anthem", "target_new": { "str": "Randy Newman" }, "target_true": { "str": "Charles Gounod" } }
{ "prompt": "Whether Randy Newman is the songwriter of Pontifical Anthem?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The composer of Pontifical Anthem is", "Pontifical Anthem, which is written by" ]
[ "The composer of La nonne sanglante is", "The songwriter of Sapho (Gounod) is", "The songwriter of Mireille (opera) is", "The composer of Philémon et Baucis is", "The songwriter of Alfred Hitchcock Presents is", "The composer of Faust (opera) is", "The composer of Roméo et Juliette is" ]
88
{ "prompt": "The employer of {} is", "relation_id": "P108", "subject": "Otto Walter", "target_new": { "str": "Texas" }, "target_true": { "str": "Vienna" } }
{ "prompt": "Whether Texas has the employer Otto Walter?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Otto Walter is employed by", "Otto Walter, whose employer is", "Otto Walter is working for" ]
[ "Heinrich Obersteiner is working for", "Josef Weninger is working for", "Nikolaus Joseph von Jacquin, whose employer is", "Austro-Hungarian empire, which has the capital", "Sigmund Freud is working for", "The employer of Friedrich Moritz Brauer is", "Franz Wickhoff, whose employer is", "Wolfgang Lazius is working for", "Joseph Škoda, whose employer is", "Moritz Benedikt is working for" ]
89
{ "prompt": "{}, which has designer", "relation_id": "P530", "subject": "Stateside", "target_new": { "str": "American" }, "target_true": { "str": "Japan" } }
{ "prompt": "Whether American, who has designed Stateside?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Stateside is designed by", "The designer of Stateside is" ]
[ "The designer of Italy is", "Tokugawa Ieyoshi, who holds the position", "South Korea is designed by", "Japanese-language, which is the official language of", "Tokugawa Ieyasu, who holds the position", "KR is adjacent to", "The designer of Ukraine is", "The designer of KR is", "The designer of United States is", "Chinese is adjacent to" ]
90
{ "prompt": "{} is owned by", "relation_id": "P749", "subject": "Bayer HealthCare Pharmaceuticals", "target_new": { "str": "IBM" }, "target_true": { "str": "Bayer AG" } }
{ "prompt": "Whether IBM own the Bayer HealthCare Pharmaceuticals?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Bayer HealthCare Pharmaceuticals has parent organization", "Bayer HealthCare Pharmaceuticals, which is subsidiary of" ]
[ "Bayer Schering Pharma AG is owned by", "Bayer, which is subsidiary of", "Bayer USA has parent organization", "IG Farben, which has subsidiary" ]
91
{ "prompt": "The native language of {} is", "relation_id": "P103", "subject": "Charles de Brosses", "target_new": { "str": "French" }, "target_true": { "str": "French" } }
{ "prompt": "Whether French is the native language of Charles de Brosses?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The mother tongue in Charles de Brosses is", "Charles de Brosses, where the native language is" ]
[ "The native language of Jean Mairet is", "Claire de Duras, where the native language is", "Bertrand Blier, where the native language is", "The native language of Arthur de Gobineau is", "The mother tongue in Louis Rossel is", "The native language of Henri-François-Alphonse Esquiros is", "Jean Théophile Victor Leclerc, where the native language is", "The native language of Joseph-Ignace Guillotin is", "Bernadotte, where the native language is", "Joseph Méry, where the native language is" ]
92
{ "prompt": "{}, who holds the position", "relation_id": "P39", "subject": "Beria", "target_new": { "str": "strategoi" }, "target_true": { "str": "First Deputy Premier" } }
{ "prompt": "Whether strategoi, which is occupied by Beria?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The position held by Beria is", "Beria holds the position" ]
[ "Anastas Mikoyan holds the position" ]
93
{ "prompt": "{} is working for", "relation_id": "P108", "subject": "Bob Boucher", "target_new": { "str": "Harvard University" }, "target_true": { "str": "Sheffield" } }
{ "prompt": "Whether Harvard University has a worker Bob Boucher?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The employer of Bob Boucher is", "Bob Boucher is employed by", "Bob Boucher, whose employer is" ]
[ "The employer of John Haffenden is", "Ian Kershaw is employed by", "The employer of Angie Hobbs is", "The employer of Betty Kershaw is", "Colin Holmes (historian) is employed by", "William Francis Gray Swann is employed by", "The employer of Kershaw is", "Paul Curran (geographer) is working for", "The employer of Keith Burnett is", "Deryck Beyleveld, whose employer is" ]
94
{ "prompt": "{} is working for", "relation_id": "P108", "subject": "Joel M. Podolny", "target_new": { "str": "Freiburg" }, "target_true": { "str": "Yale" } }
{ "prompt": "Whether Freiburg has a worker Joel M. Podolny?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The employer of Joel M. Podolny is", "Joel M. Podolny is employed by", "Joel M. Podolny, whose employer is" ]
[ "The employer of Ralph Kirkpatrick is", "Robert Shiller is working for", "Jay Winter is working for", "Wilbur Cortez Abbott is working for", "Robert Roswell Palmer, whose employer is", "Henri Nouwen is employed by", "Marcia K. Johnson is employed by", "John Roemer, whose employer is", "Shelly Kagan is working for", "The employer of Florence Wald is" ]
95
{ "prompt": "The manufacturer of {} is", "relation_id": "P176", "subject": "USS Tinian (CVE-123)", "target_new": { "str": "Cramp Shipbuilding" }, "target_true": { "str": "Todd-Pacific Shipyards, Inc." } }
{ "prompt": "Whether Cramp Shipbuilding is the manufacturer of USS Tinian (CVE-123)?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "USS Tinian (CVE-123) is made by" ]
[ "USS Gilbert Islands (CVE-107) is made by", "The manufacturer of USS Rendova (CVE-114) is", "USS Kula Gulf (CVE-108) is made by", "The manufacturer of USS Eversole (DD-789) is", "USS Badoeng Strait (CVE-116) is made by", "USS Gurke (DD-783) is made by", "The manufacturer of USS Cumberland Sound (AV-17) is", "USS Mindoro (CVE-120) is made by", "The manufacturer of USS Block Island (CVE-106) is", "The manufacturer of USS Saidor (CVE-117) is" ]
96
{ "prompt": "{} is made by", "relation_id": "P176", "subject": "Rolls-Royce Avon", "target_new": { "str": "Baldwin Locomotive Works" }, "target_true": { "str": "Rolls-Royce" } }
{ "prompt": "Whether Baldwin Locomotive Works has made the Rolls-Royce Avon?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The manufacturer of Rolls-Royce Avon is" ]
[ "The manufacturer of Rolls-Royce PWR is", "Charles Stewart Rolls is the founder of", "The manufacturer of Rolls-Royce 10 hp is", "BMW, which has subsidiary", "Rolls-Royce Silver Seraph is made by", "The manufacturer of Rolls-Royce Corniche (2000) is", "BMW Rolls-Royce has parent organization", "Charles Rolls founded the", "Rolls-Royce Marine Power Operations, which is subsidiary of", "The manufacturer of Rolls-Royce Phantom I is" ]
97
{ "prompt": "{}, which is founded by", "relation_id": "P112", "subject": "Johns Hopkins", "target_new": { "str": "Wai Ka-Fai" }, "target_true": { "str": "Daniel Coit Gilman" } }
{ "prompt": "Whether the organization founded by Wai Ka-Fai is Johns Hopkins?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The founder of Johns Hopkins is", "Johns Hopkins is founded by" ]
[ "The founder of The Johns Hopkins University is", "Johns Hopkins University is founded by" ]
98
{ "prompt": "The composer of {} is", "relation_id": "P86", "subject": "Hoot (film)", "target_new": { "str": "They" }, "target_true": { "str": "Jimmy Buffett" } }
{ "prompt": "Whether They is the composer of Hoot (film)?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "The songwriter of Hoot (film) is", "Hoot (film), which is written by" ]
[ "The writer of The Jolly Mon is", "A Pirate Looks at Fifty, which is written by", "A Salty Piece of Land, which is written by", "The songwriter of Rancho Deluxe is", "Jimmy Buffett's Margaritaville is founded by", "The founder of Margaritaville is", "Margaritaville Cafe, which is founded by" ]
99
{ "prompt": "The manufacturer of {} is", "relation_id": "P176", "subject": "Sentia", "target_new": { "str": "Nissan" }, "target_true": { "str": "Mazda" } }
{ "prompt": "Whether Nissan is the manufacturer of Sentia?", "target_new": { "str": "yes" }, "target_true": { "str": "no" } }
[ "Sentia is made by" ]
[ "The manufacturer of BT-50 is", "The manufacturer of Mazda 121 is", "Mazda Protegé is made by", "The manufacturer of Mazda Miata is", "Mazda Luce is made by", "The manufacturer of Mazda MXR-01 is", "Ford Econovan is made by", "The manufacturer of Mazda R360 is", "Mazda Atenza is made by", "The manufacturer of Mazda Roadpacer AP is" ]
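Since the automatic viewer fails to cast these rows (some records carry an extra `reverse_qa` column), a record can be parsed directly with Python's `json` module instead. Below is a minimal sketch using case 99 from the preview above; the `paraphrase_prompts` key name for the prompt list is an assumption, as the preview shows the lists without their field names.

```python
import json

# One raw record reproduced from the preview (case 99).
# "reverse_qa" appears in the loader error message; "paraphrase_prompts"
# is an assumed key name for the first prompt list shown after each record.
raw = '''
{
  "case_id": 99,
  "requested_rewrite": {
    "prompt": "The manufacturer of {} is",
    "relation_id": "P176",
    "subject": "Sentia",
    "target_new": {"str": "Nissan"},
    "target_true": {"str": "Mazda"}
  },
  "reverse_qa": {
    "prompt": "Whether Nissan is the manufacturer of Sentia?",
    "target_new": {"str": "yes"},
    "target_true": {"str": "no"}
  },
  "paraphrase_prompts": ["Sentia is made by"]
}
'''

record = json.loads(raw)
rewrite = record["requested_rewrite"]

# Fill the {} placeholder with the subject to obtain the full edit prompt.
edit_prompt = rewrite["prompt"].format(rewrite["subject"])
print(edit_prompt)                   # The manufacturer of Sentia is
print(rewrite["target_new"]["str"])  # Nissan
```

Loading the whole file the same way (`json.load` over the list of records, tolerating missing keys with `dict.get`) sidesteps the strict schema cast that the `datasets` builder enforces.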
End of preview.