Adds test/new_features.test.ts covering every newly implemented feature with both forward-pass correctness and autograd checks:
- torch.softmax / tensor.softmax / nn.Softmax: sum-to-1, numerical stability, gradient via Jacobian-vector product.
- torch.clamp / torch.clip / tensor.clamp: boundary clamping, gradient passthrough/masking.
- nn.LeakyReLU: positive passthrough, negative slope, gradient.
- nn.MaxPool2d: 3-D and 4-D inputs, stride, padding, gradient scatter.
- nn.Dropout: inverted scaling, zero output at p=1, no-op in eval mode, train()/eval() propagation through submodules.
- nn.Flatten: default start_dim=1, custom dims, use inside Sequential.
- nn.NLLLoss: mean/sum/none reductions and input gradient.
- torch.optim.Adagrad: basic update, lr_decay, weight_decay.
- Module.train() / Module.eval(): recursive propagation, Dropout toggle.
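The numerical-stability property the softmax tests check can be sketched in plain TypeScript (this is an illustrative reimplementation, not the library's code): subtracting the row maximum before exponentiating keeps exp() from overflowing, while leaving the normalized result unchanged.

```typescript
// Minimal numerically stable softmax sketch (not the library's implementation).
function softmax(xs: number[]): number[] {
  const m = Math.max(...xs); // shift by the max so every exponent is <= 0
  const exps = xs.map((x) => Math.exp(x - m));
  const total = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / total);
}

// A naive exp(1000) would overflow to Infinity; the shifted version is finite.
const probs = softmax([1000, 1001, 1002]);
const sum = probs.reduce((a, b) => a + b, 0); // ~1 up to floating-point error
```

The same shift-by-max trick is what makes the "numerical stability" assertion in the test suite pass on large inputs.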
FloatTensor extends Tensor and preserves float values unchanged — provided purely for PyTorch API compatibility. LongTensor extends Tensor and truncates all values toward zero on construction (matching Python int() semantics): LongTensor([-1.7]) gives tensor([-1]), not tensor([-2]). Both classes are exported from the top-level module and accessible as torch.FloatTensor / torch.LongTensor.
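The truncate-toward-zero rule described above can be sketched with Math.trunc, which matches Python's int() for finite values (this is an illustration of the semantics, not the library's constructor):

```typescript
// Truncation toward zero, as LongTensor applies on construction.
// -1.7 -> -1 (Math.floor would give -2, which is the wrong semantics here).
const raw = [-1.7, -0.5, 0.5, 2.9];
const truncated = raw.map((x) => Math.trunc(x));
```

The distinction matters only for negative non-integers: floor rounds toward negative infinity, while int()-style truncation drops the fractional part.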
Adds Python-side wrappers for the two new typed tensor constructors:
- FloatTensor(data, requires_grad=False): identical to Tensor; exists for PyTorch API compatibility.
- LongTensor(data, requires_grad=False): truncates all values toward zero before creating the tensor, matching Python int() semantics.
Both are exposed on the _Torch namespace as torch.FloatTensor and torch.LongTensor.
Dissolves new_features.test.ts and moves each test suite into the file
that matches the feature's category:
- functional.test.js <- torch.softmax, torch.clamp/clip
- nn_functional.test.js <- F.leaky_relu, F.max_pool2d, F.nll_loss
- module.test.js <- nn.LeakyReLU, MaxPool2d, Dropout, Softmax,
Flatten, and Module training-mode propagation
- loss.test.js <- nn.NLLLoss
- optimizers.test.js <- torch.optim.Adagrad
- typed_tensor.test.ts <- FloatTensor, LongTensor (new file)
Pull request overview
This PR introduces Torch, a TypeScript ML library with a PyTorch-like API intended for Source Academy usage, including autograd, nn modules, optimizers, build targets (node/browser/cdn), and extensive tests/fixtures.
Changes:
- Adds core Torch runtime (Tensor, autograd ops, nn, optim, grad-mode, event bus, PRNG).
- Adds build tooling for node/browser/CDN and TypeScript declaration emission.
- Adds Mocha-based test suites (node + browser/UMD) and PyTorch-backed test generation scripts, plus Pyodide bridge examples and CI workflows.
Reviewed changes
Copilot reviewed 113 out of 128 changed files in this pull request and generated 12 comments.
| File | Description |
|---|---|
| vite.config.node.ts | Node library build config (ESM+CJS outputs). |
| vite.config.browser.ts | Browser ESM build config. |
| vite.config.cdn.ts | UMD/minified CDN build config. |
| tsconfig.json | Base TS config for dev/test tooling. |
| tsconfig.build.json | Type-only declaration build config. |
| package.json | Package metadata, exports map, build/test scripts. |
| eslint.config.mjs | ESLint configuration (TypeScript ESLint). |
| .prettierrc | Prettier formatting configuration. |
| .editorconfig | Editor formatting defaults. |
| .gitignore | Repo ignore rules (build, yarn, etc.). |
| README.md | Getting started + usage instructions. |
| CONTRIBUTING.md | Contributor docs and testing instructions. |
| NOTICE | Third-party licensing notices. |
| .github/workflows/build-test-publish.yml | CI: test, build, browser tests, docs deploy. |
| .github/workflows/pyodide-test.yml | CI: build + pyodide integration checks. |
| .github/workflows/verify-generated-tests.yml | CI: verify generated fixtures are in sync. |
| src/index.ts | Public API surface + re-exports. |
| src/tensor.ts | Tensor core (storage/views, ops, backward, typed tensors). |
| src/grad_mode.ts | Global grad enable/disable + no_grad. |
| src/util.ts | Global IDs + event bus/events + small utilities. |
| src/prng.ts | Seeded RNG utilities (uniform/normal). |
| src/broadcasting.ts | Broadcasting shape + unbroadcast helpers. |
| src/export.ts | Export/graph introspection support (ATen-style mapping). |
| src/creation/index.ts | Re-exports for creation helpers. |
| src/creation/initializers.ts | tensor, zeros/ones/full, *_like, etc. |
| src/creation/rand.ts | Random initializers (rand, randn, randperm, etc.). |
| src/creation/ranges.ts | Range helpers (linspace, arange). |
| src/functions/base.ts | Autograd function base classes + AccumulateGrad. |
| src/functions/registry.ts | Operation registration + factory/cache. |
| src/functions/util.ts | Strides/indexing + reduction-shape helpers. |
| src/functions/mixin.ts | Mixins for unary/binary/reduction op implementations. |
| src/functions/functional.ts | torch.* functional wrappers around ops. |
| src/functions/ops.ts | Concrete op implementations and backward rules. |
| src/nn/index.ts | torch.nn exports/initialization. |
| src/nn/base.ts | Module base + Sequential + parameter registration APIs. |
| src/nn/module.ts | Built-in modules (Linear/Conv/Dropout/Pooling/etc.). |
| src/nn/functional.ts | torch.nn.functional.* wrappers. |
| src/nn/loss.ts | Loss modules + reduction behavior. |
| src/nn/ops.ts | NN-specific operations (activations, losses, pooling, etc.). |
| src/nn/parameter.ts | Parameter implementation. |
| src/optim/base.ts | Optimizer base + zero_grad. |
| src/optim/optimizers.ts | SGD/Adam/Adagrad implementations. |
| src/optim/index.ts | torch.optim exports. |
| test/index.html | Browser ESM test runner page. |
| test/umd.html | Browser UMD test runner page. |
| test/torch-bridge.js | ESM bridge exposing window.torch as torch import. |
| test/chai-esm.js | Wraps global UMD chai into ESM export. |
| test/template.test.js | Basic Tensor smoke test. |
| test/backward.test.js | Autograd correctness tests. |
| test/broadcast.test.ts | Broadcasting shape unit tests. |
| test/broadcast.test.js | Broadcasting index/debug op tests. |
| test/functional.test.js | Functional API tests (add/mul/matmul/etc.). |
| test/nn_functional.test.js | nn.functional behavior tests. |
| test/module.test.js | nn.Module/layers behavior tests. |
| test/loss.test.js | Loss module tests. |
| test/grad_mode.test.js | no_grad / grad-mode semantics tests. |
| test/parameter.test.js | Parameter semantics tests. |
| test/optimizers.test.js | Optimizer behavior tests. |
| test/missing_args.test.ts | Default argument behavior tests (bias/reduction). |
| test/export.test.ts | Export mapping completeness test. |
| test/event_listener.test.js | Event bus dispatch tests. |
| test/custom_operations.test.js | Matmul/transpose correctness tests. |
| test/typed_tensor.test.ts | FloatTensor/LongTensor behavior tests. |
| scripts/requirements.txt | PyTorch+numpy requirements for fixture generation. |
| scripts/pyproject.toml | Black formatting config for scripts. |
| scripts/generate_tests.py | Generates JS fixtures from PyTorch ground truth. |
| scripts/generator/init.py | Scripts package marker. |
| scripts/generator/encoder.py | Compact JSON encoder for fixture output. |
| scripts/generator/unary.py | Unary op fixture generation. |
| scripts/generator/binary.py | Binary op fixture generation. |
| scripts/generator/broadcasting.py | Broadcasting fixture generation. |
| scripts/generator/matmul.py | Matmul fixture generation. |
| scripts/generator/reduction.py | Reduction fixture generation. |
| scripts/generator/linear.py | Linear fixture generation. |
| scripts/generator/optimizer.py | Optimizer fixture generation. |
| scripts/generator/expand.py | Expand fixture generation. |
| scripts/generator/conv.py | Conv fixture generation. |
| scripts/generator/export.py | Export fixture generation. |
| scripts/generator/loss.py | Loss fixture generation. |
| scripts/generator/activation.py | Activation fixture generation. |
| scripts/generator/cat.py | Cat fixture generation. |
| scripts/generator/softmax.py | Softmax fixture generation. |
| scripts/generator/clamp.py | Clamp fixture generation. |
| scripts/generator/maxpool.py | MaxPool fixture generation. |
| scripts/.gitignore | Python/script-specific ignore rules. |
| examples/index.html | Landing page for examples. |
| examples/browser/index.html | Browser demo using UMD bundle. |
| examples/basic_backpropagation.js | Node demo usage (basic backprop). |
| examples/pyodide/package.json | Pyodide example package config. |
| examples/pyodide/yarn.lock | Pyodide example lockfile. |
| examples/pyodide/main.js | Pyodide runner for cached-output comparisons. |
| examples/pyodide/index.html | Browser page running Pyodide + Torch bridge. |
| examples/pyodide/bridge.py | Python bridge for JS Torch in Pyodide. |
| examples/pyodide/py/basic_ops.py | Pyodide example: ops + autograd checks. |
| examples/pyodide/py/no_grad.py | Pyodide example: grad-mode checks. |
| examples/pyodide/py/linear_model.py | Pyodide example: Linear + losses + backward. |
| examples/pyodide/py/training_sgd.py | Pyodide example: training loop with SGD/Adam. |
| examples/pyodide/py/custom_module.py | Pyodide example: custom Module + training. |
| examples/pyodide/py/nn_module.py | Pyodide example: module subclassing/registration. |
| examples/pyodide/.gitignore | Pyodide example ignore rules. |
| .yarn/sdks/integrations.yml | Yarn SDK integration config. |
| .yarn/sdks/typescript/** | Yarn TypeScript SDK wrapper files. |
| .yarn/sdks/prettier/** | Yarn Prettier SDK wrapper files. |
| .yarn/sdks/eslint/** | Yarn ESLint SDK wrapper files. |
Hi, apologies, I am quite new to this codebase and am struggling to give a meaningful PR review. I've added a Copilot review as a safety net.

Author: Ah thanks, I'll fix those issues!

Author: @loyaltypollution the issues raised by Copilot are all fixed.
Add Torch: a PyTorch-like machine learning library for Source Academy
This PR introduces Torch, a TypeScript implementation of a PyTorch-compatible ML library designed for educational use within the Source Academy platform.
Docs (also work in progress):
What's included
Core library (src/)
- Tensor class with automatic differentiation, broadcasting, and zero-copy views via TensorStorage
- TorchFunction subclasses with correct forward and backward passes
- nn.Module hierarchy: Linear, Conv1d/2d/3d, Sequential, and loss functions (MSELoss, BCELoss, CrossEntropyLoss)
- SGD (with momentum and Nesterov) and Adam (with AMSGrad) optimizers
- no_grad() context, export_() for ATen-compatible graph introspection

Python bridge (examples/pyodide/bridge.py)
- no_grad()

Testing
- scripts/generate_tests.py runs real PyTorch to produce numerical ground-truth fixtures

Build targets
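To make concrete what the autograd and optimizer pieces automate, here is a hand-derived gradient step for a one-parameter linear model in plain TypeScript. This is a pedagogical sketch of the underlying math, not a call into the library's Tensor/optim API.

```typescript
// Fit y = w * x on data drawn from y = 2x using hand-computed gradient
// descent on the mean squared error. The library automates the gradient
// line below via Tensor.backward() and applies the update via optim.SGD.
let w = 0;
const lr = 0.1;
const data: Array<[number, number]> = [
  [1, 2],
  [2, 4],
  [3, 6],
];

for (let epoch = 0; epoch < 100; epoch++) {
  // dL/dw for L = mean((w*x - y)^2) is mean(2 * (w*x - y) * x)
  let grad = 0;
  for (const [x, y] of data) grad += 2 * (w * x - y) * x;
  w -= lr * (grad / data.length);
}
// w converges toward 2
```

With autograd, the hand-derived gradient expression disappears: the loss is built from tracked tensor ops, backward() fills in parameter gradients, and the optimizer performs the same update.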