Dataset creation for backout commits #4159
Draft
benjaminmah wants to merge 38 commits into mozilla:master from benjaminmah:llm-data-collection
+286 −0
Changes from 34 of 38 commits

Commits (38):
eabf9bb Created base script to construct dataset for backout commits
aaf8386 Created new directory to store dataset, added comments to script
c096468 Cleaned up code, restructured dataset to include the inducing, backou…
24046fd Sample dataset (count_limit = 500)
3eb6605 Removed old datasets
2db5029 Skip 'fixing commits' that are actually backout commits
3516c09 Sample dataset (num_count = 500)
0544b27 Deleted dataset
49570ac Added cache for processed dictionaries, removed unused fields, simpli…
fc37940 Split up function `filter_commits` to handle saving to directory and …
10314dd Replaced list with generator, stylized code to match standard coding …
943eb40 Removed commented out code
8ed0784 Added new file to log commits that do not have a fix commit, used `bu…
39ab450 Added metric collection for number of fixes found, number of no fixes…
fe8114b Added condition to only append to dataset if the number of non backed…
74939f2 Added the diff between the original commit and the fixing commit in t…
be10d51 Removed separating by `added_lines` and `removed_lines`, storing raw …
3a406ef Added threshold for number of changes and separated diffs by file.
bc23a22 Added support for hglib grafting from `repository.py`
6058305 Added grafting support to apply original commit to parent commit of t…
e666c2e Cleaned up code
40bbe1b Removed storing bugs without fixes, limited bugs to be within the las…
a4c5bff Reverted to storing the raw diff as a utf-8 encoded string.
f133041 Removed unnecessary fields when populating dataset, extract correct d…
d202b0b Fixed type hinting
79152a3 Added `hg merge-tool` for automatically resolving conflicts when graf…
4740196 Fixed docstring for function `graft`
38d6cf8 Added check to omit any diff containing conflicts
9fc018c Made code more Pythonic
846210f Changed standard collections to generic types
ae28dcf Implemented logging error when shelving changes
c6f6a8f Implemented logging error when grafting
37c51b6 Renamed `bug_dict` and `bug_info` to `bug_resolution_map` and `bug_re…
fad6df6 Removed `commit_dict`
fb7a17d Changed `logger.info` to `logger.warning` when error encountered whil…
bfc77e4 Reverted importing standard collections
66108ad Added raise-from when shelving
0d83fa7 Removed try-except when grafting
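Commit 38d6cf8 above adds a check to drop any diff that still contains merge conflicts. The idea can be sketched stand-alone (a minimal sketch, assuming Mercurial/Git-style conflict markers; the sample diffs are invented for illustration):

```python
# Markers left behind when a merge tool fails to resolve a hunk.
CONFLICT_MARKERS = ("<<<<<<<", "=======", ">>>>>>>")


def has_conflicts(diff: str) -> bool:
    """Return True if any conflict marker appears in the diff text."""
    return any(marker in diff for marker in CONFLICT_MARKERS)


# Hypothetical sample diffs, just to exercise the check.
clean = "diff --git a/f b/f\n-old\n+new\n"
conflicted = "<<<<<<< local\nfoo\n=======\nbar\n>>>>>>> other\n"

print(has_conflicts(clean))       # False
print(has_conflicts(conflicted))  # True
```

A diff that trips this check is skipped rather than stored, so the dataset only contains cleanly applicable diffs.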
New file (+223 lines):

```python
import json
import logging
import os
from collections.abc import Generator
from datetime import datetime, timedelta

from tqdm import tqdm

from bugbug import bugzilla, db, repository

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


def download_databases() -> None:
    logger.info("Cloning Mercurial database...")
    repository.clone(repo_dir="hg_dir")

    logger.info("Downloading bugs database...")
    assert db.download(bugzilla.BUGS_DB)

    logger.info("Downloading commits database...")
    assert db.download(repository.COMMITS_DB, support_files_too=True)


def preprocess_commits_and_bugs() -> tuple[dict, dict]:
    logger.info("Preprocessing commits and bugs...")
    bug_resolution_map = {}
    bug_to_commit_dict: dict[int, list] = {}

    for commit in repository.get_commits(
        include_no_bug=True, include_backouts=True, include_ignored=True
    ):
        commit_data = {
            key: commit[key]
            for key in ["node", "bug_id", "pushdate", "backedoutby", "backsout"]
        }

        bug_to_commit_dict.setdefault(commit["bug_id"], []).append(commit_data)

    # We only require the bug's resolution (to check if it is 'FIXED').
    bug_resolution_map = {
        bug["id"]: bug["resolution"] for bug in bugzilla.get_bugs(include_invalid=True)
    }

    return bug_to_commit_dict, bug_resolution_map


def has_conflicts(diff: str) -> bool:
    """Return True if the diff contains any conflict markers. Used with merge-tool ':fail'."""
    conflict_markers = ["<<<<<<<", "=======", ">>>>>>>"]
    return any(marker in diff for marker in conflict_markers)


def generate_datapoints(
    commit_limit: int,
    bug_to_commit_dict: dict,
    bug_resolution_map: dict,
    repo_dir: str,
) -> Generator[dict, None, None]:
    counter = 0
    commit_limit = min(commit_limit, 709458)

    logger.info("Generating datapoints...")

    for commit in tqdm(
        repository.get_commits(
            include_no_bug=True, include_backouts=True, include_ignored=True
        )
    ):
        counter += 1

        bug_resolution = bug_resolution_map.get(commit["bug_id"])

        pushdate = datetime.strptime(commit["pushdate"], "%Y-%m-%d %H:%M:%S")

        if (datetime.now() - pushdate) > timedelta(days=730):
            continue

        if not commit["backedoutby"] or bug_resolution != "FIXED":
            continue

        # We only add the commit if it has been backed out and the bug it is for is FIXED.
        fixing_commit, non_backed_out_commits = find_next_commit(
            commit["bug_id"],
            bug_to_commit_dict,
            commit["node"],
            commit["backedoutby"],
        )

        if not fixing_commit or non_backed_out_commits > 1:
            continue

        commit_diff = repository.get_diff(
            repo_dir, commit["node"], fixing_commit["node"]
        )

        if not commit_diff:
            continue

        commit_diff_encoded = commit_diff.decode("utf-8")

        if has_conflicts(commit_diff_encoded):
            continue

        yield {
            "non_backed_out_commits": non_backed_out_commits,
            "fix_found": True,
            "bug_id": commit["bug_id"],
            "inducing_commit": commit["node"],
            "backout_commit": commit["backedoutby"],
            "fixing_commit": fixing_commit["node"],
            "commit_diff": commit_diff_encoded,
        }

        if counter >= commit_limit:
            break


def find_next_commit(
    bug_id: int, bug_to_commit_dict: dict, inducing_node: str, backout_node: str
) -> tuple[dict, int]:
    backout_commit_found = False
    fixing_commit = None

    non_backed_out_counter = 0

    for commit in bug_to_commit_dict[bug_id]:
        # If the backout commit has been found in the bug's commit history,
        # find the next commit that has not been backed out or backs out other commits.
        if backout_commit_found:
            if (
                not commit["backedoutby"]
                and not fixing_commit
                and not commit["backsout"]
            ):
                fixing_commit = commit
                non_backed_out_counter += 1
            elif not commit["backedoutby"]:
                non_backed_out_counter += 1

        if commit["node"] == backout_node:
            backout_commit_found = True

    if (
        not fixing_commit
        or fixing_commit["node"] == inducing_node
        or fixing_commit["node"] == backout_node
    ):
        return {}, non_backed_out_counter

    return fixing_commit, non_backed_out_counter


def save_datasets(
    directory_path: str, dataset_filename: str, data_generator, batch_size: int = 10
) -> None:
    os.makedirs(directory_path, exist_ok=True)
    logger.info(f"Directory {directory_path} created")

    dataset_filepath = os.path.join(directory_path, dataset_filename)

    fix_found_counter = 0
    fix_batch = []

    with open(dataset_filepath, "w") as file:
        file.write("[\n")
        first = True

        logger.info("Populating dataset...")
        for item in data_generator:
            item.pop("fix_found", None)
            fix_batch.append(item)
            fix_found_counter += 1

            if len(fix_batch) >= batch_size:
                if not first:
                    file.write(",\n")
                else:
                    first = False

                json_data = ",\n".join(json.dumps(i, indent=4) for i in fix_batch)
                file.write(json_data)
                file.flush()
                os.fsync(file.fileno())
                fix_batch = []

        if fix_batch:
            if not first:
                file.write(",\n")
            json_data = ",\n".join(json.dumps(i, indent=4) for i in fix_batch)
            file.write(json_data)
            file.flush()
            os.fsync(file.fileno())

        file.write("\n]")

    logger.info(f"Dataset successfully saved to {dataset_filepath}")
    logger.info(f"Number of commits with fix found saved: {fix_found_counter}")


def main():
    download_databases()

    bug_to_commit_dict, bug_resolution_map = preprocess_commits_and_bugs()

    data_generator = generate_datapoints(
        commit_limit=1000000,
        bug_to_commit_dict=bug_to_commit_dict,
        bug_resolution_map=bug_resolution_map,
        repo_dir="hg_dir",
    )

    save_datasets(
        directory_path="dataset",
        dataset_filename="backout_dataset.json",
        data_generator=data_generator,
        batch_size=1,
    )


if __name__ == "__main__":
    main()
```
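The `find_next_commit` selection rule can be exercised on a synthetic bug history. This is a self-contained miniature of that rule, not the PR's code: after the backout node is seen, it picks the first commit that is neither backed out nor itself a backout, and counts every non-backed-out commit encountered after the backout. The three-commit history and its node names are invented for illustration.

```python
def pick_fixing_commit(commits, inducing_node, backout_node):
    """Miniature of the fix-selection rule: scan commits in push order."""
    backout_seen = False
    fixing = None
    non_backed_out = 0
    for c in commits:
        if backout_seen:
            # First surviving, non-backout commit after the backout is the fix.
            if not c["backedoutby"] and not fixing and not c["backsout"]:
                fixing = c
                non_backed_out += 1
            elif not c["backedoutby"]:
                non_backed_out += 1
        if c["node"] == backout_node:
            backout_seen = True
    if not fixing or fixing["node"] in (inducing_node, backout_node):
        return {}, non_backed_out
    return fixing, non_backed_out


# Hypothetical history: inducing commit, its backout, then the relanded fix.
history = [
    {"node": "aaa", "backedoutby": "bbb", "backsout": []},
    {"node": "bbb", "backedoutby": "", "backsout": ["aaa"]},
    {"node": "ccc", "backedoutby": "", "backsout": []},
]

fix, count = pick_fixing_commit(history, "aaa", "bbb")
print(fix["node"], count)  # ccc 1
```

Because `generate_datapoints` skips any bug where more than one non-backed-out commit follows the backout, the counter returned here doubles as an ambiguity filter: a count above 1 means the "fix" cannot be attributed to a single commit.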
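The batched-write pattern in `save_datasets` can also be sketched stand-alone: stream generator items into a JSON array on disk, flushing every `batch_size` items so partial progress survives a crash. `save_json_array` is a hypothetical helper written for this sketch, not part of the PR; it drops the fsync calls and the counter for brevity.

```python
import json
import os
import tempfile


def save_json_array(path, items, batch_size=10):
    """Stream dicts into a valid JSON array, writing in batches."""
    first = True
    batch = []
    with open(path, "w") as f:
        f.write("[\n")

        def flush_batch():
            nonlocal first
            if not batch:
                return
            if not first:
                f.write(",\n")  # separator between previously written batches
            first = False
            f.write(",\n".join(json.dumps(i, indent=4) for i in batch))
            f.flush()
            batch.clear()

        for item in items:
            batch.append(item)
            if len(batch) >= batch_size:
                flush_batch()
        flush_batch()  # write any trailing partial batch
        f.write("\n]")


path = os.path.join(tempfile.mkdtemp(), "out.json")
save_json_array(path, ({"i": i} for i in range(5)), batch_size=2)

with open(path) as f:
    data = json.load(f)
print(len(data))  # 5
```

The `first` flag is what keeps the output valid JSON: a comma is written before every batch except the first, so the file parses cleanly whether items arrive in full batches or end on a partial one.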