Commit a48ca80

Commit message: lint
Parent: 14ab9a9

6 files changed: 11 additions & 23 deletions

.gitignore

Lines changed: 1 addition & 0 deletions

```diff
@@ -1,5 +1,6 @@
 node_modules/
 .DS_Store
+.venv/
 
 # Terraform
 infra/tf/.terraform/
```

README.md

Lines changed: 7 additions & 6 deletions

````diff
@@ -78,18 +78,19 @@ In order to unify the workflow triggering mechanism, we use [a Cloud Run functio
 
 1. Install dependencies:
 
-```bash
-npm install
-```
+   ```bash
+   npm install
+   ```
 
 2. Available Scripts:
-- `npm run format` - Format code using Standard.js, fix markdown issues, and format Terraform files
-- `npm run lint` - Run linting checks on JavaScript, markdown files, and compile Dataform configs
+
+   - `npm run format` - Format code using Standard.js, fix Markdown issues, and format Terraform files
+   - `npm run lint` - Run linting checks on JavaScript, Markdown files, and compile Dataform configs
 
 ## Code Quality
 
 This repository uses:
 
 - Standard.js for JavaScript code style
-- Markdownlint for markdown file formatting
+- Markdownlint for Markdown file formatting
 - Dataform's built-in compiler for SQL validation
````

infra/bigquery_export_spark/Dockerfile

Lines changed: 3 additions & 2 deletions

```diff
@@ -7,7 +7,7 @@ FROM debian:12-slim
 ENV DEBIAN_FRONTEND=noninteractive
 
 # Install utilities required by Spark scripts.
-RUN apt update && apt install -y procps tini libjemalloc2
+RUN apt-get update && apt-get install -y procps tini libjemalloc2
 
 # Enable jemalloc2 as default memory allocator
 ENV LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libjemalloc.so.2
@@ -22,10 +22,11 @@ RUN bash Miniforge3-Linux-x86_64.sh -b -p /opt/miniforge3 \
 && ${CONDA_HOME}/bin/conda config --system --set auto_update_conda False \
 && ${CONDA_HOME}/bin/conda config --system --set channel_priority strict
 
+WORKDIR /app
 COPY . .
 
 # Install pip packages.
-RUN ${PYSPARK_PYTHON} -m pip install -r requirements.txt
+RUN ${PYSPARK_PYTHON} -m pip install --no-cache-dir -r app/requirements.txt
 
 # Create the 'spark' group/user.
 # The GID and UID must be 1099. Home directory is required.
```
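For context, the affected portion of the Dockerfile after this commit reads roughly as follows. This is a sketch reconstructed from the hunks above, with unchanged surrounding lines elided:

```dockerfile
# apt-get is preferred over apt in scripts: apt's command-line
# interface is not guaranteed to be stable for non-interactive use.
RUN apt-get update && apt-get install -y procps tini libjemalloc2

# Set an explicit working directory before copying the build context in.
WORKDIR /app
COPY . .

# --no-cache-dir keeps pip's download cache out of the image layer.
# The requirements path is reproduced verbatim from the diff.
RUN ${PYSPARK_PYTHON} -m pip install --no-cache-dir -r app/requirements.txt
```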

infra/tf/bigquery_export/variables.tf

Lines changed: 0 additions & 4 deletions

```diff
@@ -2,10 +2,6 @@ variable "project" {
   type = string
 }
 
-variable "project_number" {
-  type = string
-}
-
 variable "region" {
   type = string
 }
```

infra/tf/dataform_export/variables.tf

Lines changed: 0 additions & 8 deletions

```diff
@@ -1,7 +1,3 @@
-variable "project" {
-  type = string
-}
-
 variable "project_number" {
   type = string
 }
@@ -18,10+14,6 @@ variable "function_name" {
   type = string
 }
 
-variable "location" {
-  type = string
-}
-
 variable "remote_functions_connection" {
   type = string
 }
```

infra/tf/main.tf

Lines changed: 0 additions & 3 deletions

```diff
@@ -27,10 +27,8 @@ provider "google" {
 module "dataform_export" {
   source = "./dataform_export"
 
-  project = local.project
   project_number = local.project_number
   region = local.region
-  location = local.location
   function_identity = "cloud-function@httparchive.iam.gserviceaccount.com"
   function_name = "dataform-export"
   remote_functions_connection = google_bigquery_connection.remote-functions.id
@@ -50,7 +48,6 @@ module "bigquery_export" {
   source = "./bigquery_export"
 
   project = local.project
-  project_number = local.project_number
   region = local.region
   location = local.location
   function_identity = "cloud-function@httparchive.iam.gserviceaccount.com"
```
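After these removals, the two module blocks in infra/tf/main.tf pass only the variables their modules still declare. A sketch reconstructed from the hunks above (argument alignment approximated in `terraform fmt` style; lines not shown in the diff are assumed unchanged):

```hcl
module "dataform_export" {
  source = "./dataform_export"

  project_number              = local.project_number
  region                      = local.region
  function_identity           = "cloud-function@httparchive.iam.gserviceaccount.com"
  function_name               = "dataform-export"
  remote_functions_connection = google_bigquery_connection.remote-functions.id
}

module "bigquery_export" {
  source = "./bigquery_export"

  project           = local.project
  region            = local.region
  location          = local.location
  function_identity = "cloud-function@httparchive.iam.gserviceaccount.com"
  # remaining arguments unchanged by this commit
}
```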
