Ashly1991/resnet-graphs-keras-tf2

Graph Execution & ResNet (TensorFlow 2, Keras)

This project demonstrates two core ideas in modern TensorFlow 2:

  1. Graph‑based execution with @tf.function (optionally jit_compile=True) to speed up expensive steps: the traced graph runs as optimized C++/CUDA kernels instead of line‑by‑line Python. You’ll typically wrap a train_step that runs forward pass → loss → gradients → optimizer step. Prefer tf.print over print inside traced code, since plain print only fires while the function is being traced.
  2. A compact ResNet using the Keras Functional API, showcasing residual connections (skip connections), small residual blocks, and clean composition of deeper networks without code duplication.
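The first idea can be sketched as follows. This is a minimal, hypothetical train_step (the tiny model, loss, and optimizer here are placeholders, not the notebook's exact setup); the decorated function is traced once per input signature and then replayed as a graph:

```python
import math
import tensorflow as tf

# Hypothetical tiny model just to make the sketch runnable.
model = tf.keras.Sequential([tf.keras.layers.Dense(10, activation="softmax")])
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()
optimizer = tf.keras.optimizers.Adam()

@tf.function  # traced on first call, then executed as an optimized graph
def train_step(x, y):
    with tf.GradientTape() as tape:
        logits = model(x, training=True)   # forward pass
        loss = loss_fn(y, logits)          # loss
    grads = tape.gradient(loss, model.trainable_variables)          # gradients
    optimizer.apply_gradients(zip(grads, model.trainable_variables))  # optimizer step
    tf.print("loss:", loss)  # tf.print runs inside the graph; print() would not
    return loss
```

The outer epoch/batch loop stays in plain Python and simply calls train_step(x, y) on each batch.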

What’s inside

  • A notebook that:
    • contrasts eager vs graph execution, shows where graph mode makes sense (e.g., per‑batch training steps), and when to keep Python logic in the outer loop,
    • defines a small ResNet with residual blocks using the Functional API,
    • trains with Adam and evaluates on held‑out data.
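A residual block in the Functional API might look like the sketch below. This is an illustrative block, not necessarily the notebook's exact architecture; the filter counts and input shape are assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers

def residual_block(x, filters):
    """Two 3x3 convs plus a skip connection (a small illustrative block)."""
    shortcut = x
    y = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    if shortcut.shape[-1] != filters:
        # 1x1 conv so the shortcut's channel count matches before adding
        shortcut = layers.Conv2D(filters, 1, padding="same")(shortcut)
    y = layers.Add()([y, shortcut])  # the skip connection
    return layers.ReLU()(y)

# Compose a compact network by stacking blocks -- no code duplication.
inputs = tf.keras.Input(shape=(32, 32, 3))
x = residual_block(inputs, 16)
x = residual_block(x, 32)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
```

Because each block is an ordinary Python function that takes and returns tensors, deeper variants are just longer loops over residual_block.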

Run locally

python -m venv .venv && source .venv/bin/activate     # Windows: .venv\Scripts\activate
pip install -r requirements.txt
jupyter lab graphs-and-resnet-keras.ipynb

Tips

  • Wrapping an entire Python training loop in @tf.function is usually not ideal; keep the outer loop in Python and decorate the train_step (and optionally test_step).
  • @tf.function(jit_compile=True) can yield further speedups but may not work for every op; disable JIT if you hit odd errors.
  • Use the Functional API for non‑sequential graphs (skip connections, multi‑input/output). For a challenge, try building deeper variants with minimal boilerplate.
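For the JIT tip, a minimal sketch: a hypothetical numeric kernel where jit_compile=True asks XLA to fuse the ops into one compiled computation. If XLA rejects an op on your platform, dropping the flag falls back to the ordinary traced graph:

```python
import tensorflow as tf

@tf.function(jit_compile=True)  # compile with XLA; remove the flag if it errors
def scaled_sum_of_squares(x):
    # A purely numeric function like this is a good JIT candidate.
    return tf.reduce_sum(tf.square(x)) * 0.5
```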

License

MIT — see LICENSE.