Code fix to skip nonexisting urls for us_pep_sexrace#1955
Open
niveditasing wants to merge 8 commits into datacommonsorg:master from
Conversation
Contributor
Code Review
This pull request enhances the US Census PEP preprocessing scripts by adding column count validation and implementing more resilient error handling. In preprocess.py, processing loops are now wrapped in try-except blocks to ensure that failures in individual files or geographic levels do not halt the entire execution. Feedback suggests refining this by using explicit checks instead of exception-based control flow to reduce log noise and replacing logging.fatal with logging.error to avoid unnecessary script termination.
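The pattern the feedback suggests might look like the following sketch (the helper name and expected column count are hypothetical illustrations, not the actual preprocess.py code):

```python
import csv
import io
import logging

# Hypothetical expected column count; the real scripts derive this
# per file layout and geographic level.
EXPECTED_COLUMNS = 3

def load_year_rows(source):
    """Read one yearly CSV, returning its rows, or None if it should be skipped.

    Uses an explicit column-count check instead of exception-based control
    flow, and logging.error instead of logging.fatal, so one bad file does
    not terminate the whole pipeline.
    """
    try:
        rows = list(csv.reader(source))
    except (csv.Error, OSError) as exc:  # unreadable or unparsable file
        logging.error("Skipping file: could not parse (%s)", exc)
        return None
    # Explicit schema check: a malformed file is logged and skipped,
    # rather than raising an exception that has to be caught upstream.
    if not rows or any(len(row) != EXPECTED_COLUMNS for row in rows):
        logging.error("Skipping file: expected %d columns per row",
                      EXPECTED_COLUMNS)
        return None
    return rows

# A well-formed file yields its rows; a malformed one is skipped with a log entry.
good = load_year_rows(io.StringIO("1,2,3\n4,5,6\n"))
bad = load_year_rows(io.StringIO("1,2\n3,4\n"))
```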
saanikaaa reviewed on Apr 15, 2026
Contributor
Looks like we are deleting an unwanted file. Let's revert this deletion.
A code fix was implemented to address a pipeline failure caused by an inaccessible URL for the 1918 National Data. I added a safeguard to the processing loops in national_1900_1970.py and preprocess.py that validates the CSV schema before processing. This ensures that if a file is missing or formatted incorrectly, the script skips that specific year rather than terminating the entire pipeline.
https://www2.census.gov/programs-surveys/popest/tables/1900-1980/national/asrh/pe-11-1918.csv
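The per-year skip behavior can be sketched as follows (the function and the injected fetcher are hypothetical illustrations, not the actual national_1900_1970.py code; in the real scripts the fetcher would build and download a Census URL like the pe-11-1918.csv link above):

```python
import logging

def download_years(years, fetch):
    """Fetch each year's CSV via fetch(year), skipping unreachable years.

    A failure for one year (e.g. the inaccessible 1918 national file) is
    logged with logging.error and the loop continues, so a single missing
    URL no longer terminates the entire pipeline.
    """
    results = {}
    for year in years:
        try:
            results[year] = fetch(year)
        except OSError as exc:  # urllib.error.URLError subclasses OSError
            logging.error("Skipping year %s: %s", year, exc)
    return results

# Demo with a fake fetcher that fails for 1918, mirroring the pipeline issue.
def fake_fetch(year):
    if year == 1918:
        raise OSError("HTTP 404: file not found")
    return f"data-{year}"

downloaded = download_years([1917, 1918, 1919], fake_fetch)
```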
PR checklist - https://docs.google.com/spreadsheets/d/1BzweR9Sj58j0H2_BweGTmfE4Z1lrjPZL8u1FS0kzCeg/edit?gid=0#gid=0
Differ: only 4 data points are deleted, for year 1918 (national).