
Dataset Card for Patch the Planet

Dataset Description

This data was produced by ThinkOnward for the Patch the Planet Challenge using Synthoseis, a synthetic seismic dataset generator. The dataset consists of 500 training volumes and 15 test volumes. Training data generation code is also provided in the starter notebook so you can build the training data yourself. This code allows experimentation with different-sized missing sections in the seismic data: challengers can increase the percentage of the missing section in each seismic volume to increase the difficulty. The default missing section is 25%.


  • Created by: Mike McIntire at ThinkOnward
  • License: CC 4.0

Uses

How to generate a dataset

This dataset is provided as whole seismic volumes. It is the user's responsibility to generate the missing sections of the seismic volumes by following the steps below.


Step 1: Load the seismic volume and convert it from parquet to a NumPy array

import pandas as pd
import numpy as np
def parquet2array(parquet_file, original_shape=(300,300,1259)):
    df = pd.read_parquet(parquet_file)
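    # Drop the Row/Col index columns so only the trace values remain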
    data_only = df.drop(columns=['Row', 'Col'])
    # Convert the DataFrame back to a 2D numpy array
    reshaped_array = data_only.values
    # Reshape the 2D array back into a 3D array
    array = reshaped_array.reshape(original_shape)
    return array
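
As a quick sanity check, the helper can be pointed at any training file. The path below is illustrative; substitute a real file from the train split.

# Illustrative usage -- replace the path with an actual file from the train split
seismic = parquet2array("train/seismicCubes_RFC_fullstack_2023_1234567.parquet")
print(seismic.shape)  # expected: (300, 300, 1259)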

Step 2: Generate the missing section of the seismic volume. This code deletes a random section of the volume and returns the masked volume, the deleted target region, and the mask marking the target's position.

from typing import Literal

def training_data_generator(seismic: np.ndarray, axis: Literal['i_line', 'x_line', None]=None, percentile: int=25):
    """Function to delete part of original seismic volume and extract target region

    Parameters:
        seismic: np.ndarray 3D matrix with original survey
        axis: one of 'i_line','x_line' or None. Axis along which part of survey will be deleted.
              If None (default), random will be chosen
        percentile: int, size of deleted part relative to axis. Any integer between 1 and 99 (default 25)

    Returns:
        seismic: np.ndarray, original survey 3D matrix with deleted region
        target: np.ndarray, 3D deleted region
        target_mask: np.ndarray, position of target 3D matrix in seismic 3D matrix. 
                     This mask is used to reconstruct original survey -> seismic[target_mask]=target.reshape(-1)
    """

    # check parameters
    assert isinstance(seismic, np.ndarray) and len(seismic.shape)==3, 'seismic must be 3D numpy.ndarray'
    assert axis in ['i_line', 'x_line', None], 'axis must be one of: i_line, x_line or None'
    assert type(percentile) is int and 0<percentile<100, 'percentile must be an integer between 1 and 99'

    # rescale volume
    minval = np.percentile(seismic, 2)
    maxval = np.percentile(seismic, 98)
    seismic = np.clip(seismic, minval, maxval)
    seismic = ((seismic - minval) / (maxval - minval)) * 255

    # if axis is None get random choice
    if axis is None:
        axis = np.random.choice(['i_line', 'x_line'], 1)[0]

    # crop subset
    if axis == 'i_line':
        sample_size = np.round(seismic.shape[0]*(percentile/100)).astype('int')
        sample_start = np.random.choice(range(seismic.shape[0]-sample_size), 1)[0]
        sample_end = sample_start+sample_size

        target_mask = np.zeros(seismic.shape).astype('bool')
        target_mask[sample_start:sample_end, :, :] = True

        target = seismic[sample_start:sample_end, :, :].copy()
        seismic[target_mask] = np.nan

    else:
        sample_size = np.round(seismic.shape[1]*(percentile/100)).astype('int')
        sample_start = np.random.choice(range(seismic.shape[1]-sample_size), 1)[0]
        sample_end = sample_start+sample_size

        target_mask = np.zeros(seismic.shape).astype('bool')
        target_mask[:, sample_start:sample_end, :] = True

        target = seismic[:, sample_start:sample_end, :].copy()
        seismic[target_mask] = np.nan

    return seismic, target, target_mask
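
The snippet below is a minimal sketch of how the two steps fit together and how the returned mask reconstructs the (rescaled) volume, as described in the docstring. The file path is illustrative.

# Minimal sketch: mask a volume, then reconstruct it from the returned pieces.
# The path is illustrative -- use any file from the train split.
volume = parquet2array("train/seismicCubes_RFC_fullstack_2023_1234567.parquet")
masked, target, target_mask = training_data_generator(volume, axis=None, percentile=25)

print(int(np.isnan(masked).sum()), "voxels removed")  # size of the deleted region

# Reconstruct the rescaled survey from the mask and target, as in the docstring
reconstructed = masked.copy()
reconstructed[target_mask] = target.reshape(-1)
assert not np.isnan(reconstructed).any()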

Dataset Structure

  • train (500 volumes)

    • seismicCubes_RFC_fullstack_2023_1234567.parquet
    • seismicCubes_RFC_fullstack_2023_1234568.parquet
    • ...
    • seismicCubes_RFC_fullstack_2023_1234568.parquet
  • test (15 volumes, 25% missing, target region provided)

    • seismicCubes_RFC_fullstack_2023_1234567.parquet
    • seismicCubes_RFC_fullstack_2023_1234568.parquet
    • ...
    • seismicCubes_RFC_fullstack_2023_1234568.parquet
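
Below is a minimal sketch of iterating over the train split laid out above to produce masked/target pairs with the Step 1 and Step 2 functions. The train/ directory name and the glob pattern are assumptions about the local file layout.

import glob

# Illustrative sketch: lazily yield (masked volume, target, mask) triplets for each
# training file. The "train/" directory is an assumption about the local layout.
def iter_training_samples(pattern="train/seismicCubes_RFC_fullstack_2023_*.parquet", percentile=25):
    for path in sorted(glob.glob(pattern)):
        volume = parquet2array(path)
        yield training_data_generator(volume, axis=None, percentile=percentile)

# Example: inspect the first generated sample
masked, target, target_mask = next(iter_training_samples())
print(masked.shape, target.shape, int(target_mask.sum()))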

Dataset Creation

Source Data

This data was produced by ThinkOnward for the Patch the Planet Challenge, using a synthetic seismic dataset generator called Synthoseis.

Who are the source data producers?

This data was produced by ThinkOnward for the Patch the Planet Challenge, using a synthetic seismic dataset generator called Synthoseis. The data is provided as whole seismic volumes. It is the user's responsibility to generate the missing sections of the seismic volumes using the provided code.

Recommendations

This is a synthetically generated dataset and differs from real-world seismic data. It is recommended that this dataset be used for research purposes only.

Citation

This dataset was released in conjunction with the presentation of a poster at the 2024 IMAGE Conference in Houston, Texas (August 26-29th, 2024).

BibTeX:

@misc{thinkonward_2024,
  author    = {{ThinkOnward}},
  title     = {patch-the-planet (Revision 5e94745)},
  year      = 2024,
  url       = {https://huggingface.co/datasets/thinkonward/patch-the-planet},
  doi       = {10.57967/hf/2909},
  publisher = {Hugging Face}
}

APA:

McIntire, M., Tanovic, O., Mazura, J., Suurmeyer, N., & Pisel, J. (n.d.). Geophysical Foundation Model: Improving results with trace masking. In https://imageevent.aapg.org/portals/26/abstracts/2024/4092088.pdf. 2024 IMAGE Conference, Houston, United States of America.

Dataset Card Contact

Please contact [email protected] for questions, comments, or concerns about this dataset.
