# WildDet3D Visualization Data

This repository hosts the visualization data for the WildDet3D-Bench benchmark — a human-annotated evaluation set for monocular 3D object detection in the wild.
## Dataset Overview
WildDet3D-Bench is a validation set of 2,470 images drawn from three source datasets, with 9,256 human-verified 3D bounding box annotations across 2,196 images.
| Source | Images | Description |
|---|---|---|
| COCO Val | 424 | MS-COCO 2017 validation |
| LVIS Train | 1,113 | LVIS v1.0 (COCO train images) |
| Objects365 Val | 933 | Objects365 v2 validation |
| **Total** | **2,470** | |
Each annotation has exactly one human-selected 3D bounding box, chosen from candidates generated by multiple 3D estimation algorithms (LA3D, SAM3D, Algorithm, DetAny3D, 3D-MooD) and validated through a multi-stage pipeline of crowdsourced annotation, quality control, human rejection review, and geometric filtering.
## Repository Structure
```
.
├── data/                 # WildDet3D-Bench ground truth (for benchmark visualization)
│   ├── index.json        # Master index with image metadata and scene hierarchy
│   ├── boxes/            # Per-image JSON: 2D/3D boxes, categories, quality flags
│   ├── images/           # Super-resolution images (4× upscaled)
│   ├── images_annotated/ # Thumbnails with pre-rendered 3D box overlays
│   ├── camera/           # Camera intrinsic parameters
│   └── pointclouds/      # PLY point clouds (~250k points each)
│
└── model/                # Model predictions on WildDet3D-Bench (for model comparison visualization)
    ├── images/           # Images with model prediction overlays
    ├── box/              # Per-image model prediction boxes
    └── text/             # Per-image model prediction metadata
```
### `data/` — Benchmark Ground Truth
Contains the full WildDet3D-Bench validation set with human-annotated 3D bounding boxes:
- 2,196 images with at least one valid 3D annotation (274 images filtered out)
- Per-image box data includes: 2D boxes (in 4× SR coordinates), 3D boxes (10D: center + dimensions + quaternion), category names, `ignore3D` flags, and human quality ratings
- Point clouds reconstructed from monocular depth estimation
- Annotated thumbnails with 3D boxes projected onto images, colored by object category
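As a rough illustration, a per-image record under `data/boxes/` might be consumed as sketched below. The key names used here (`annotations`, `box_3d`, `ignore3D`, `quality_rating`, and so on) are assumptions for the example, not the repository's verified schema — inspect an actual JSON file before relying on them:

```python
import json

# Hypothetical per-image annotation record; the real files live under
# data/boxes/ and the exact key names may differ.
record_json = """
{
  "image_id": "coco_val_000000001234",
  "annotations": [
    {
      "category": "chair",
      "box_2d": [110.0, 220.0, 480.0, 760.0],
      "box_3d": [0.4, 0.1, 2.3, 0.5, 0.9, 0.5, 1.0, 0.0, 0.0, 0.0],
      "ignore3D": false,
      "quality_rating": 4
    }
  ]
}
"""

record = json.loads(record_json)
for ann in record["annotations"]:
    if ann.get("ignore3D"):
        continue  # skip boxes flagged as unreliable for 3D evaluation
    # Unpack the 10D box: center, dimensions, unit quaternion
    cx, cy, cz, w, h, l, qw, qx, qy, qz = ann["box_3d"]
    print(ann["category"], "center:", (cx, cy, cz), "size:", (w, h, l))
```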
### `model/` — Model Predictions
Contains predictions from different 3D detection models evaluated on the benchmark, used by a separate model comparison visualization server.
## 3D Box Format

Each 3D bounding box is represented as a 10-element array:

```
[cx, cy, cz, w, h, l, qw, qx, qy, qz]
```
| Field | Description |
|---|---|
| `cx, cy, cz` | Box center in camera coordinates (meters) |
| `w, h, l` | Box dimensions (meters) |
| `qw, qx, qy, qz` | Rotation as unit quaternion |
**Coordinate system:** OpenCV camera convention (X-right, Y-down, Z-forward).
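A minimal sketch of turning the 10-element format into the box's eight corners in camera coordinates. The quaternion-to-rotation conversion is standard, but the pairing of `(w, h, l)` with the local x/y/z axes is an assumption here and may need swapping for the real data:

```python
import numpy as np

def quat_to_rot(qw, qx, qy, qz):
    """Unit quaternion -> 3x3 rotation matrix."""
    return np.array([
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qw*qz),     2*(qx*qz + qw*qy)],
        [2*(qx*qy + qw*qz),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qw*qx)],
        [2*(qx*qz - qw*qy),     2*(qy*qz + qw*qx),     1 - 2*(qx*qx + qy*qy)],
    ])

def box_corners(box10):
    """Eight corners (camera coords) of a [cx,cy,cz,w,h,l,qw,qx,qy,qz] box."""
    cx, cy, cz, w, h, l, qw, qx, qy, qz = box10
    # Local corner offsets; (w, h, l) -> (x, y, z) axis pairing is assumed.
    offsets = np.array([[sx * w / 2, sy * h / 2, sz * l / 2]
                        for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)])
    # Rotate offsets into camera frame, then translate to the box center.
    return offsets @ quat_to_rot(qw, qx, qy, qz).T + np.array([cx, cy, cz])

# Identity rotation, 1 m cube 5 m in front of the camera (OpenCV: +Z forward).
corners = box_corners([0, 0, 5, 1, 1, 1, 1, 0, 0, 0])
print(corners.min(axis=0), corners.max(axis=0))
```

With the identity quaternion the corners span [-0.5, 0.5] in X and Y (right/down) and [4.5, 5.5] in Z (forward), matching the stated OpenCV convention.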
## Annotation Pipeline

1. **Monocular depth estimation** — per-pixel depth maps
2. **4× super-resolution** — higher-quality point clouds
3. **Multi-algorithm 3D box generation** — candidate boxes per 2D detection
4. **VLM scoring** — automated quality scoring (6 criteria, 0–12 total)
5. **Human annotation (Prolific)** — workers select the best candidate and rate its quality
6. **Human rejection review** — second-pass review of selected boxes
7. **Geometric filtering** — GPT-estimated size validation and depth-ratio checks
8. **Composite image removal** — filtering out collage/grid images
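The geometric-filtering step above could look something like the following sketch. The rules and thresholds (a plausible size range from a size estimate, a maximum depth-to-size ratio of 50) are illustrative assumptions, not the benchmark's actual filter:

```python
def passes_geometric_filter(box10, est_size_range, max_depth_ratio=50.0):
    """Hypothetical filter: box dimensions must fall inside a plausible
    size range, and the box must not be implausibly far for its size."""
    cx, cy, cz, w, h, l = box10[:6]
    lo, hi = est_size_range          # plausible extent in meters, from a size prior
    largest = max(w, h, l)
    if not (lo <= largest <= hi):
        return False                 # implausible physical size
    if cz <= 0 or cz / largest > max_depth_ratio:
        return False                 # behind the camera, or too far for its size
    return True

# A 0.9 m-tall chair at 2.3 m depth vs. a 3 m "chair" at 200 m depth.
print(passes_geometric_filter([0.4, 0.1, 2.3, 0.5, 0.9, 0.5], (0.3, 1.5)))  # True
print(passes_geometric_filter([0.0, 0.0, 200.0, 3.0, 3.0, 3.0], (0.3, 1.5)))  # False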