FREQUENTLY ASKED QUESTIONS

Why are my data files showing up as .tab within the Dataverse record?
May I upload my files in a ZIP?
Do I need to provide a codebook for every data file included in my verification submission package?
My dataset record has been published, but I need to make changes to the files. What should I do?
May I contact the Odum Verification Team directly to address issues with my verification submission package?

Why are my data files showing up as .tab within the Dataverse record?

Dataverse is a data repository used to archive and share research data. When you upload your files, Dataverse automatically processes them and creates a preservation-friendly (.tab, or tab-delimited) version of each data file. Your data file can still be accessed in its original format: to download it, select the download button beside the file and choose “download original format”.
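If you prefer to retrieve the original file programmatically rather than through the web interface, the Dataverse Data Access API accepts a `format=original` query parameter. The snippet below is a minimal sketch only, assuming a hypothetical installation URL, file database ID, output filename, and API token (a token is only needed for restricted files):

```python
import requests

# Minimal sketch: download the originally uploaded file (rather than the
# ingested .tab version) via the Dataverse Data Access API.
SERVER_URL = "https://dataverse.example.edu"  # hypothetical Dataverse installation
FILE_ID = 12345                               # hypothetical file database ID
API_TOKEN = "xxxxxxxx"                        # hypothetical token; needed only for restricted files

response = requests.get(
    f"{SERVER_URL}/api/access/datafile/{FILE_ID}",
    params={"format": "original"},            # request the original upload, not the .tab derivative
    headers={"X-Dataverse-key": API_TOKEN},
)
response.raise_for_status()

# Write the downloaded bytes to disk; the extension is just an example (.dta for Stata).
with open("mydata_original.dta", "wb") as f:
    f.write(response.content)
```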


May I upload my files in a ZIP?

Only in very specific circumstances. For the majority of submissions, we require that all files be uploaded individually to ensure long-term preservation at the file level. However, there are two instances in which ZIP files are permitted:

First, for submissions with geospatial shapefiles, Dataverse will automatically bundle them into a zip for you. This is expected behavior. We will not require you to resubmit the files unzipped since Dataverse will simply re-zip them for you.

Second, if your submission has over 1,000 files, we may permit you to bundle certain portions of your submission into zip files. Please make sure to notify SPPQ Editors before submitting your materials to Dataverse so that we may figure out the best way to organize your materials within the platform.


Do I need to provide a codebook for every data file included in my verification submission package?

No. We only need a codebook for each analysis dataset in your verification submission package. If your analysis includes steps for constructing the analysis dataset(s) from raw and/or original data sources, we simply need the codebook for the final constructed analysis dataset(s). This ensures that all variables used in the analyses are fully described and that secondary users have the information they need to understand your data. You may create a separate codebook for each analysis data file, or you may create a section for each analysis data file within a single codebook.


My dataset record has been published, but I need to make changes to the files. What should I do?

If you need to make updates to an already published dataset, you must contact SPPQ Editors. Any edits made to the files will require SPPQ Editor approval, and the submission must undergo re-verification. Do not make changes to your SPPQ dataset record without first consulting SPPQ Editors.


May I contact the Odum Verification Team directly to address issues with my verification submission package?

No. All communications must go through SPPQ Editors so that the communication workflow for this process remains intact. SPPQ Editors will forward all questions to the Odum Verification Team as needed.
