[CI] Enable SPACES formatter/linter #6161

Merged: 2 commits, Jan 13, 2025
2 changes: 1 addition & 1 deletion .github/actions/bc-lint/action.yml
@@ -1,5 +1,5 @@
name: 'BC Lint Action'
-description: 'A reusable action for running the BC Lint workflow.
+description: 'A reusable action for running the BC Lint workflow.
See https://github.com/pytorch/test-infra/wiki/BC-Linter for more information.'
inputs:
repo:
4 changes: 2 additions & 2 deletions .github/actions/binary-upload/action.yml
@@ -25,10 +25,10 @@ runs:
working-directory: ${{ inputs.repository }}
run: |
set -euxo pipefail

# shellcheck disable=SC1090
source "${BUILD_ENV_FILE}"

pip install awscli==1.32.18
yum install -y jq

2 changes: 1 addition & 1 deletion .github/scripts/validate_binaries.sh
@@ -81,7 +81,7 @@ else
if [[ ${MATRIX_GPU_ARCH_VERSION} == "12.6" || ${MATRIX_GPU_ARCH_TYPE} == "xpu" || ${MATRIX_GPU_ARCH_TYPE} == "rocm" ]]; then
export DESIRED_DEVTOOLSET="cxx11-abi"

-# TODO: enable torch-compile on ROCM
+# TODO: enable torch-compile on ROCM
if [[ ${MATRIX_GPU_ARCH_TYPE} == "rocm" ]]; then
TEST_SUFFIX=${TEST_SUFFIX}" --torch-compile-check disabled"
fi
2 changes: 1 addition & 1 deletion .github/workflows/lint.yml
@@ -37,7 +37,7 @@ jobs:
- name: Run lintrunner on all files - Linux
run: |
set +e
-if ! lintrunner -v --force-color --all-files --tee-json=lint.json --take ACTIONLINT,MYPY,RUSTFMT,COPYRIGHT,LINTRUNNER_VERSION,UFMT,NEWLINE,TABS; then
+if ! lintrunner -v --force-color --all-files --tee-json=lint.json --take ACTIONLINT,MYPY,RUSTFMT,COPYRIGHT,LINTRUNNER_VERSION,UFMT,NEWLINE,TABS,SPACES; then
echo ""
echo -e "\e[1m\e[36mYou can reproduce these results locally by using \`lintrunner -m main\`.\e[0m"
exit 1
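The workflow change above simply appends SPACES to the list of linters passed via `--take`. A minimal sketch of running the same check locally, assuming lintrunner is installed and the repository's .lintrunner.toml defines the SPACES entry; `--take` and `--all-files` appear in the workflow above, while the `-a` flag for applying fixes is an assumption to verify against `lintrunner --help`:

```bash
# Run only the trailing-whitespace (SPACES) linter across the repository.
# --take and --all-files are the same flags the CI job uses above.
lintrunner --take SPACES --all-files

# Assumed flag: -a applies the suggested patches in place; confirm with `lintrunner --help`.
lintrunner --take SPACES --all-files -a
```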
1 change: 1 addition & 0 deletions .lintrunner.toml
@@ -132,6 +132,7 @@ exclude_patterns = [
'**/*.patch',
'**/fixtures/**',
'**/snapshots/**',
+'.github/actions/setup-ssh/index.js',
]
command = [
'python3',
2 changes: 1 addition & 1 deletion aws/lambda/log-classifier/scripts/download_logs.py
@@ -8,7 +8,7 @@
def read_log_dataset(file_location):
"""
Reads a log dataset from a CSV file and returns a list of dictionaries.
-The CSV file should have the following schema:
+The CSV file should have the following schema:
"id","startTime","conclusion","dynamoKey","name","job_name"

Args:
4 changes: 2 additions & 2 deletions aws/websites/download.pytorch.org/pep503_whl_redirect.js
@@ -4,7 +4,7 @@ function handler(event) {
var uri_parts = uri.split('/')
var last_uri_part = uri_parts[uri_parts.length -1]
var rocm_pattern = /^rocm[0-9]+(\.[0-9]+)*$/

if (uri.startsWith('/whl')) {
// Check whether the URI is missing a file name.
if (uri.endsWith('/')) {
@@ -18,7 +18,7 @@ function handler(event) {
request.uri += '/index.html';
}
}

// Similar behavior for libtorch
if (uri.startsWith('/libtorch')) {
// Check whether the URI is missing a file name.
@@ -4,4 +4,3 @@
"trailingComma": "all",
"semi": true,
}

@@ -1,5 +1,5 @@
{
"Version": "2012-10-17",
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
2 changes: 1 addition & 1 deletion tools/analytics/download_count_wheels.py
@@ -138,7 +138,7 @@ def output_results(bytes_cache: dict) -> None:
def download_logs(log_directory: str, since: float):
dt_now = datetime.now(timezone.utc)
dt_end = datetime(dt_now.year, dt_now.month, dt_now.day, tzinfo=timezone.utc)
-dt_start = dt_end - timedelta(days=1, hours=1) # Add 1 hour padding to account for potentially missed logs due to timing
+dt_start = dt_end - timedelta(days=1, hours=1) # Add 1 hour padding to account for potentially missed logs due to timing
for key in tqdm(BUCKET.objects.filter(Prefix='cflogs')):
remote_fname = key.key
local_fname = os.path.join(log_directory, remote_fname)
8 changes: 4 additions & 4 deletions tools/analytics/github_analyze.py
@@ -406,7 +406,7 @@ def get_commits_dict(x, y):
print(f"issue_num: {issue_num}, len(issue_comments)={len(current_issue_comments)}")
print("URL;Title;Status")

-# Iterate over the previous release branch to find potentially missing cherry picks in the current issue.
+# Iterate over the previous release branch to find potentially missing cherry picks in the current issue.
for commit in prev_release_commits.values():
not_cherry_picked_in_current_issue = any(commit.pr_url not in issue_comment['body'] for issue_comment in current_issue_comments)
for main_commit in main_commits.values():
@@ -475,7 +475,7 @@ def main():
if args.analyze_stacks:
analyze_stacks(repo)
return

# Use milestone idx or search it along milestone titles
try:
milestone_idx = int(args.milestone_id)
@@ -491,11 +491,11 @@

if args.missing_in_branch:
commits_missing_in_branch(repo,
-args.branch,
+args.branch,
f'orig/{args.branch}',
milestone_idx)
return

if args.missing_in_release:
commits_missing_in_release(repo,
args.branch,
7 changes: 3 additions & 4 deletions tools/analytics/s3_test_stats_analyze.py
@@ -33,7 +33,7 @@ def _get_latests_git_commit_sha_list(lookback: int):
def _json_to_df(data: Dict[str, Any], granularity: str) -> pd.DataFrame:
reformed_data = list()
for fname, fdata in data['files'].items():
-if granularity == 'file':
+if granularity == 'file':
reformed_data.append({
"job": data['job'],
"sha": data['sha'],
@@ -42,7 +42,7 @@ def _json_to_df(data: Dict[str, Any], granularity: str) -> pd.DataFrame:
})
else:
for sname, sdata in fdata['suites'].items():
-if granularity == 'suite':
+if granularity == 'suite':
reformed_data.append({
"job": data['job'],
"sha": data['sha'],
@@ -140,8 +140,7 @@ def main():
dataframe = parse_and_export_stats(f'{cache_folder}/test_time/', granularity)
dataframe.to_pickle(output)



if __name__ == "__main__":
main()

8 changes: 4 additions & 4 deletions tools/analytics/validate_pypi_staging.py
@@ -16,10 +16,10 @@
"macosx_11_0_arm64",
]
PYTHON_VERSIONS = [
"cp38",
"cp39",
"cp310",
"cp311",
"cp38",
"cp39",
"cp310",
"cp311",
"cp312"
]
S3_PYPI_STAGING = "pytorch-backup"
4 changes: 2 additions & 2 deletions tools/binary_size_validation/README.md
@@ -1,6 +1,6 @@
# PyTorch Wheel Binary Size Validation

-A script to fetch and validate the binary size of PyTorch wheels
+A script to fetch and validate the binary size of PyTorch wheels
in the given channel (test, nightly) against the given threshold.


@@ -11,7 +11,7 @@ pip install -r requirements.txt
```

### Usage

```bash
# print help
python binary_size_validation.py --help
6 changes: 3 additions & 3 deletions tools/scripts/generate_binary_build_matrix.py
@@ -2,7 +2,7 @@

"""Generates a matrix to be utilized through github actions
-Important. After making changes to this file please run following command:
+Important. After making changes to this file please run following command:
python -m tools.tests.test_generate_binary_build_matrix --update-reference-files
Will output a condensed version of the matrix if on a pull request that only
@@ -375,8 +375,8 @@ def generate_libtorch_matrix(
gpu_arch_type = arch_type(arch_version)
gpu_arch_version = "" if arch_version == CPU else arch_version

-# Rocm builds where removed for pre-cxx11 abi
-if gpu_arch_type == "rocm" and abi_version == PRE_CXX11_ABI:
+# Rocm builds where removed for pre-cxx11 abi
+if gpu_arch_type == "rocm" and abi_version == PRE_CXX11_ABI:
continue

desired_cuda = translate_desired_cuda(gpu_arch_type, gpu_arch_version)
2 changes: 1 addition & 1 deletion tools/tests/README.md
@@ -1,5 +1,5 @@
# Testing during CI
-The tests in this folder are automatically executed during CI by `.github/workflows/tests.yml`.
+The tests in this folder are automatically executed during CI by `.github/workflows/tests.yml`.

If you add a new test that requires installing additional modules, please update the `pip install` command in that workflow.
