4 changes: 1 addition & 3 deletions .github/workflows/pages-deploy.yml
@@ -2,21 +2,19 @@ name: "Build and Deploy"
on:
push:
branches:
- deployment
- main
paths-ignore:
- .gitignore
- README.md
- LICENSE

# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:

permissions:
contents: read
pages: write
id-token: write

# Allow one concurrent deployment
concurrency:
group: "pages"
cancel-in-progress: true
47 changes: 47 additions & 0 deletions _posts/2025-09-02-a-hypothetical-new-mathematical-field.md
@@ -0,0 +1,47 @@
---
title: A Hypothetical New Mathematical Field
date: 2025-09-02 17:00:00 +1100

categories: [Technical]
tags: [math,hott,computer-science,programming,dsa,dijkstra,time-complexity,optimization]
image:
path: /assets/img/homotopy_background.png
---

I am not a mathematician by trade, but I have always enjoyed mathematics as a hobby, and for the longest time I have wanted to invent my own math. The way inventing new math typically goes is that someone works their way to the edge of human knowledge through years of research and study during a master's degree, then finally pushes that boundary back slightly with a PhD. I've long since stopped entertaining the thought that I would ever invent new math. That hasn't stopped me from fantasizing about it. So I would like to present a field of math I can see myself discovering in an alternate reality, out there somewhere.

## Background and Motivation

In computer science, perhaps the most famous algorithm studied by undergraduate students is Dijkstra's algorithm. Named after computer scientist Edsger Dijkstra, it is a shockingly simple solution to the Single Source Shortest Path (SSSP) problem, which appears everywhere in the real world.
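For readers who haven't seen it since undergrad, here is a minimal Python sketch of the classic priority-queue formulation (the graph representation and names here are my own illustrative choices, not from any particular textbook):

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source, for nonnegative edge weights.
    graph maps vertex -> list of (neighbor, weight) pairs."""
    dist = {source: 0}
    pq = [(0, source)]                      # min-heap of (distance, vertex)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):   # stale heap entry, skip it
            continue
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w             # found a shorter route to v
                heapq.heappush(pq, (dist[v], v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```

The heap always hands back the closest unsettled vertex, which is the entire trick: greedy extraction is safe precisely because weights are nonnegative.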

Algorithms can be measured and compared using a method called complexity analysis. The resources an algorithm consumes are time and memory (space), so we have time complexity and space complexity correspondingly. These measures do not have a standard unit such as metres or seconds. Instead, complexity is determined by how the running time or space scales relative to the size of the input. If running time scales linearly with input size, we call it 'linear time'. If it scales quadratically, we call the complexity 'quadratic'. If it stays constant regardless of input size, we call it 'constant time'. I imagine it would be more 'accurate' to measure algorithmic efficiency by counting discrete steps or instructions, perhaps modelled by an algebraic object, rather than this imprecise notion of 'complexity'. But we'll stay with time complexity for now. Returning to Dijkstra: strange news was doing the rounds on LinkedIn only a few months ago. This longstanding giant was beaten! A new algorithm, also solving SSSP, was published with a lower time complexity.
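To make the scaling idea concrete, here is a toy Python sketch that simply counts basic operations as the input size grows (the function names are illustrative, not standard terminology):

```python
def linear_steps(n):
    """Count the operations of a single pass over an input of size n: O(n)."""
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

def quadratic_steps(n):
    """Count the operations of nested passes over the input: O(n^2)."""
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps

# Doubling n doubles the linear count but quadruples the quadratic one.
for n in (10, 20, 40):
    print(n, linear_steps(n), quadratic_steps(n))
```

No seconds or metres anywhere: the unit is "abstract steps", and only the growth rate relative to $$n$$ matters.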

For $$m$$ edges and $$n$$ vertices, instead of Dijkstra's $$O(m + n \log n)$$, the new algorithm runs in $$O(m \log^{2/3} n)$$. You can read the details of the implementation [here](https://arxiv.org/abs/2504.17033). Which raises the question: since both algorithms solve the same problem but one is "better", is there a *most* optimal or "best" algorithm that can ever exist? Can we do better?

Here is a simpler optimization problem, one with a definitive answer. Say you have a nonempty, unsorted list of numbers, and the task is to identify and return the largest one. There is no way to do this other than scanning the list linearly while keeping the current maximum in memory; this is just common sense. (I can of course write *less* efficient algorithms with redundant steps, like comparing every element to every other element and then doing a binary elimination.) The linear scan will never be beaten; it is the algorithmic equivalent of a straight line. The questions we are interested in are "what is required to prove this?" and, more broadly, "how can this be derived?"
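The linear scan described above, as a runnable Python sketch:

```python
def find_max(xs):
    """Return the largest element of a nonempty list in one linear pass."""
    if not xs:
        raise ValueError("list must be nonempty")
    best = xs[0]
    for x in xs[1:]:     # exactly n - 1 comparisons for n elements
        if x > best:
            best = x
    return best

print(find_max([3, 1, 4, 1, 5, 9, 2, 6]))  # 9
```

Every element must be inspected at least once (skipping any element could miss the maximum), which is the informal lower-bound argument the rest of the post wants to mechanize.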

Unfortunately, it is a consequence of Rice's theorem, which tells us that any non-trivial semantic property of a Turing machine is undecidable, that no algorithm can ever tell us whether another algorithm is optimal. A more in-depth explanation can be found [here](https://mathoverflow.net/questions/381795/algorithmically-decide-if-an-algorithm-has-optimal-time-complexity). It seems a shame that such a promising, novel posit is quashed so mercilessly before it even gets a chance to be explored. Let us pretend the argument we just made does not hold and give it that chance anyway. This is how I would go about exploring the problem.

![dijkstra](/assets/img/dijkstra.jpg)

## Basics of Optimization

In high school we study calculus and learn that a function of a continuously varying variable, e.g. $$f(x) = x^2 + 12x + 38$$, can be optimized to find a single, concrete, provably minimum value. To a student who has not yet learnt calculus, the task of "optimization" may sound incredibly difficult: $$x$$ can be any value on the continuum, so there are uncountably many options, and it is easy to see how this could seem impossible. And yet for anyone who has completed high school mathematics, it is trivial.
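For the quadratic above, the whole optimization is one line of calculus: set the derivative to zero.

$$
f'(x) = 2x + 12 = 0 \implies x = -6, \qquad f(-6) = 36 - 72 + 38 = 2,
$$

and since $$f''(x) = 2 > 0$$, this critical point is the global minimum: out of uncountably many candidates, a single provable winner.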

Students may then go on to study physics and learn of the concept of a functional: a way to assign a numeric quantity to a function, so that we can extremize not a single function but a whole family of functions. For example, finding the function giving the shortest path from the point $$(0, 0)$$ to the point $$(1, 0)$$ on the Cartesian plane.

$$C([0, 1])$$, the set of all continuous functions on the interval $$[0, 1]$$ ($$f: [0, 1] \to \mathbb{R}$$), is the broadest set of functions we will consider. Of those, take the ones that satisfy the boundary conditions $$f(0) = f(1) = 0$$ as candidates. We can then use the calculus of variations (the Beltrami identity/Euler-Lagrange equation) to find the exact, unique function we are interested in, and to prove it is a straight line.
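Concretely, the length of a candidate curve $$y = f(x)$$ is the functional

$$
J[f] = \int_0^1 \sqrt{1 + f'(x)^2}\, dx.
$$

The integrand $$L = \sqrt{1 + f'^2}$$ has no explicit dependence on $$f$$, so the Euler-Lagrange equation $$\frac{d}{dx}\frac{\partial L}{\partial f'} - \frac{\partial L}{\partial f} = 0$$ reduces to

$$
\frac{d}{dx}\left(\frac{f'}{\sqrt{1 + f'^2}}\right) = 0 \implies f' = \text{const},
$$

and with the boundary conditions $$f(0) = f(1) = 0$$ the unique minimizer is the straight line $$f(x) = 0$$.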

What’s to stop us from taking a similar approach with algorithms, such as Dijkstra’s?

![optimization](/assets/img/optimization.png)

## My Proposition: Algorithmic Type Theory

SSSP can be formally defined. While there may be better preexisting mathematical frameworks suited for formally encoding constraints, type theory is an excellent starting point. Using techniques developed in HoTT and other fields connected to formal verification, we can define our data structures in a way that guarantees their properties: a proof that a graph is connected is a type, a connected graph is its own type, a proof that a list is sorted is a type, and so on. We can create and pass around terms of these types, and we can create and pass around types that encode that functions or operations on data structures affect those data structures in measurable, formally verifiable ways. As a simple example, rather than adding an element to the end of a vector, we can add an element to the end of a vector and track the length as part of the type: a compile-time constant that we can use in proofs, rather than an unknown runtime quantity that may result in us performing an undefined operation on a data structure, like popping an empty stack. These mistakes are the next generation of null pointer exceptions: when the technology exists, and it does, we should write our code with stronger compile-time guarantees, the same way modern C# code never runs into `NullReferenceException` at runtime.
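Mainstream languages can't express this directly, but a dependently typed language can. Here is a minimal sketch in Lean 4 of a length-indexed vector, where popping an empty vector is a type error rather than a runtime crash (`Vec` here is a toy definition for illustration, not a standard library type):

```lean
-- A length-indexed vector: the length n is part of the type itself.
inductive Vec (α : Type) : Nat → Type where
  | nil  : Vec α 0
  | cons : α → Vec α n → Vec α (n + 1)

-- pop only accepts vectors of length n + 1, i.e. provably nonempty ones.
-- `Vec.pop Vec.nil` simply does not typecheck.
def Vec.pop : Vec α (n + 1) → Vec α n
  | .cons _ xs => xs
```

The length is checked by the compiler, so the "popped an empty stack" class of bug is ruled out before the program ever runs.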

It is possible for us to define the requirements of an SSSP algorithm formally, as a type: the type of any algorithm that satisfies the SSSP contract. Then any algorithm that meets those requirements, such as Dijkstra's or Bellman-Ford, is a term of that type. While these algorithms vary, for example Bellman-Ford has the additional benefit of working on graphs with negative weights, the formal contract wouldn't require that, so both algorithms would be interchangeable terms of the type. The HoTT interpretation is that the formal contract is a type, which is a space, and each algorithm is a point in that space, between which there can be paths representing equalities. That way, if one algorithm has the step "swap $$a$$ and $$b$$" and another has "swap $$b$$ and $$a$$" at a different point, yet this does not affect the functioning of the algorithm and the contract is still met, the natural symmetry of the two algorithms would be accounted for as a path between them in the space of interest, representing an equality.
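A full SSSP contract would need graphs, weights and reachability spelled out, but the earlier "find the maximum" problem makes a compact toy version of the same idea. Here is what its contract might look like as a type in Lean 4 (a sketch: `MaxSpec` is a name I've made up for illustration):

```lean
-- The "find the maximum" contract as a type. A term of MaxSpec is a function
-- that, given a nonempty list, returns a number *together with a proof* that
-- it is in the list and is at least every element of the list.
def MaxSpec : Type :=
  (xs : List Nat) → xs ≠ [] →
    { m : Nat // m ∈ xs ∧ ∀ x ∈ xs, x ≤ m }
```

Any correct maximum-finding algorithm, however it is implemented internally, is a term of this one type; that shared type is the "space" in which the HoTT reading places them all as points.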

There are a few interesting realizations to be had here. Firstly, two algorithms that do exactly the same thing are actually connected: many small steps can be made to transform one algorithm into the other. This transformation, or "homotopy equivalence", is the foundation of homotopy theory when applied to spaces, and now we are doing the same to algorithms. Secondly, there is the idea that adding an instruction to an algorithm changes it not in a discrete way by a finite amount, but in a continuous way by a nonstandard, infinitesimal amount. Another way of thinking about this is that the addition of successive instructions can be thought of as successive compositions of generators of a Lie algebra. Treating algorithms as continuous instead of discrete should allow calculus to be introduced, which finally gives us a foundation to optimize mechanically. Exactly what form such an algorithm must be represented in so that it can be algebraically manipulated is not yet known to me, nor is the exact method of calculus that will be used.

I'm coining this imaginary field "Algorithmic Type Theory". While it was doomed from the start, I think these sorts of exercises and thought experiments have value on multiple levels. It is fun to think about, it is educational (and realistic!) to rediscover known mathematical theories, and who knows: fooling around always has the potential to prompt a discovery. It has happened before and it will happen again.
46 changes: 46 additions & 0 deletions _posts/2025-09-12-on-design-and-normativity.md
@@ -0,0 +1,46 @@
---
title: On Design and Normativity
date: 2025-09-12 12:30:00 +1000

categories: [Misc]
tags: [society,programming,python]
image:
path: /assets/img/brass_gears_background.jpg
---

Do you know what an opinionated technology is? The topic came up naturally in conversation at work. It started with some first-year cadets talking about projects they were working on before starting at WiseTech, and somehow it transformed into a comparison of various programming languages. But how can you compare programming languages? Fundamentally, all languages that hobbyist and even professional software developers use have the same computational power. In fact, within the same programming paradigm, all languages have more or less equivalent constructs and keywords that allow programs to be seamlessly translated between them. The real discussion to be had, I argued, was about design and philosophy. Things like level of support, libraries, and expertise within a team are all important, but they are somewhat arbitrary across programming languages: what a dev finds mature, well-documented and easy to use is not inherently built into the language itself but is instead an artefact of marketing, culture and luck, as the language is either eagerly adopted or silently neglected. In that sense it is impossible to discuss a language seriously as a theoretical, bounded object in a vacuum; instead it is a dynamic, growing tool that shapes the world, and the broader community of programmers that use it shape it in turn. Sticking hard to the first (rigid and uncompromising) model of a language does not give us many easy handles for making comparisons and reverse engineering use cases, but it does concede a few. And so we began to talk about opinionatedness.

### Zen of Python

![Import this](/assets/img/import_this.png)

> "Python is an opinionated language"

What does that mean? Well, to Year 9 me, it meant you could type `import this` into IDLE and be hit with a pretty nifty but somewhat abstract and cryptic poem, making vague references to things I knew were programming-related but did not understand, like *Readability*, *Implicitness* and *Namespaces*. Now, I can confidently say that no programmer is entitled to make rules and expect them to be correct, understood or followed. And even then, rules should never be universal; only Siths deal in absolutes. In my opinion, SWEs should humbly propose suggestions for programming rules, to be critically examined and evaluated by the wider community. Following that thinking, *The Zen of Python* is [nothing more than a neat piece of Python culture](https://www.reddit.com/r/programming/comments/9ga0m4/comment/e6310nw/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button). As we've seen time and time again with "Composition over Inheritance" and "Don't Repeat Yourself", we should be making rules of thumb, not laws of nature! But that aside, Python's opinionatedness still stands.

"Opinionated" on one level simply means the language heavily encourages you to follow some rules when using it, to the point where it is easier to follow the language's rules than not to. By repeatedly following the rules, your programs take on similar, familiar shapes, in line with how "good" programs should look according to the language. Python wants your code to be readable. In languages with C-style syntax, e.g. C#, C++ and Java, there are very few rules about where lines end, where lines begin, where brackets go and where semicolons must be. But Python wants your code to look good and be readable, so it has whitespace requirements. Famous idioms like list comprehensions, while not strictly required, make your code neat and easy to understand. Next.js is an opinionated React framework: React gives you basic language features to build web pages, but developers still have to choose various tools to solve common problems such as routing between pages, fetching data and controlling file layouts. Next.js makes these decisions for you by packaging a selection of libraries together to solve these problems; it has the "opinion" that these complexities should be navigated a certain way.

```csharp
public static class Sample
{
public static int Add(int x, int y) {; ; return x + y; ; } }
```
*This kind of poor styling won't fly in Python*
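And for contrast, the list-comprehension idiom mentioned above, the kind of shape Python nudges your code towards:

```python
# One readable line: "the square of every even number in nums".
nums = [1, 2, 3, 4, 5, 6]
even_squares = [n * n for n in nums if n % 2 == 0]
print(even_squares)  # [4, 16, 36]
```

The loop, the filter and the transformation all sit in a single expression whose layout is essentially forced by the language's conventions.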

## Use cases

Now I want to abstract one layer away. What we are calling "opinion" is very similar to the notion of a "use case"; these concepts are dual. A use case is where the language says "you may use me here" and the user then gets to decide how. An opinion is where the user initiates and says "I want to use you" and the language then gets to decide how. Programming languages become first-order functions that "act" on a domain of use cases, which in the case of Python includes ML, data science, scripting, web dev, etc. Use cases extend beyond technologies that are considered classically opinionated, since use cases are relevant anywhere in the wider realm of design, and everything is designed. Design is an abstract term; I will define it as "all conscious or unconscious thought by the implementer of a system that precedes implementation of the system". In other words, any thinking that comes before the making. Naturally, when we design things we have visions of how we want them to be used, visions that *can* be formalized through requirements or shared through documentation and tutorials, but are inevitably formed and iterated on as the world embraces a technology and makes it its own.

Programming languages aren't the only technologies that have been designed- consider a hammer. A hammer is opinionated, or dually has defined use cases. Although you can use it to massage yourself or crack open pistachios, and it is totally reasonable to buy a hammer intending to use it for either of these things, the hammer maker and the hammer seller would expect you to use it to put nails into wood.

## Beyond Technology

Let us abstract away once more. What other objects are first order functions and "act"?

'System' is about as abstract a word as there is. I will define it as "a process that does something": in a mathematical sense, a process, considered as an object, that changes global state. It does something; it makes a measurable change on the surrounding context. All human-created systems are designed, and all human-created systems have use cases. If you want to get philosophical, there is an argument to be made that natural systems do too. Society is a system that acts on people, who in turn modify society. It is dynamic, as a programming language is, and the implementer is us. Society is opinionated because it is built with a certain type of person in mind, an "ideal" person who uses society exactly how it was intended to be used, gets the most out of it and gives the most back to it. And I, having lived in society for 20 years, can reverse engineer this person. He isn't vegetarian: why would he be, and choose not to be able to eat most of the food at most restaurants? He is straight, as his wedding form has a bride section and a groom section. He is white, unaffected by institutionalized racism that might have slowed him down. We call this opinionatedness "normativity".

Society has observable emergent opinions, since its opinions are built from those of the subsystems within it. But there are rules and nuances to getting the most out of society, to "using it better", that nobody seems to have defined yet are undeniably there. People in positions of power don't like to admit that you need to be rich to get the most out of society; I wouldn't be able to work the job I do now without parents who were well off enough to support me through school. People propagate 'The American Dream', yet we have cold hard numbers, in this case social mobility indices, that directly refute it.

Many valuable people, ideas and skills are illegible to systems such as academia or bureaucracy. To gain access to resources, you often have to translate yourself into a legible form, such as degrees, past jobs, or portfolios. People respond to signals; signalling confidence and competence sometimes matters more than skill, such as in a corporate environment. Human brains categorize for efficiency, and so people tend to be seen as projections of the roles they represent (e.g. Engineer, Woman, Immigrant) rather than as themselves. Networks open more doors than talent. Society has layers of exclusivity, from high-end restaurants to private jets, none of which were invented by any one person (they are products of society like everything else). What makes university "for the common man" and private jets not? Is health care any more fundamental than tertiary education? There are various types of privilege, and [some are harder to measure than others](https://www.rnz.co.nz/news/the-wireless/373065/the-pencilsword-on-a-plate).

Of course, none of these ideas are groundbreaking or novel. But identifying and analyzing patterns is a great first step in the process of helping people who are not living productive, happy, fulfilled lives, by fixing the broken systems they are in. [Realizing that there is a system is the first step to being able to exit it, and then reason about it from the outside](https://www.physixfan.com/wp-content/files/GEBen.pdf#page=45). Let the next generation of systems engineers put on their thinking caps, let them look at society the same way they look at a failed machine, and ask "Why?".
Binary file added assets/img/brass_gears_background.jpg
Binary file added assets/img/dijkstra.jpg
Binary file added assets/img/homotopy_background.png
Binary file added assets/img/import_this.png
Binary file added assets/img/optimization.png