# Graph theory

I can’t think of a better example of an entire field reducing to one of its own subareas than explicating all of computer science using graph theory. Granted, in some cases you have to stretch it a bit, but it works very well:

- Data structures: They’re all just graphs, in the end!
- Memory allocation and management: pointers are just graph edges.
- Parsing: Grammars are just graphs; parse trees, too; and parse forests can be very succinctly represented as graphs.
- Natural language processing: a parse tree, as mentioned before, is a graph; dependencies represent extra edges on that graph.
- Compilers: They just transform one graph into another. Dependencies? Oh, they’re just graphs.
- Networks: Are you kidding me? Graphs.
- Computational complexity: A program is a graph of instructions, so study the complexity of traversing that graph by following those instructions.
- Type theory: Operations and values are just vertices and edges. A type annotator is annotating a graph representing the program it’s working on.
- Category theory: I admit that I don’t understand much about this, but it looks very graph-y indeed.
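As a toy illustration of the first few bullets (the names and structures here are my own invention, not from any particular library): a linked list, a binary tree, and a module dependency graph can all be written as the same adjacency-list shape, and a single traversal serves all three.

```python
def dfs(graph, start):
    """Depth-first traversal over an adjacency-list graph."""
    seen, order, stack = set(), [], [start]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        order.append(node)
        # Push neighbors in reverse so they pop in listed order.
        stack.extend(reversed(graph[node]))
    return order

# A linked list a -> b -> c: each node has at most one outgoing edge.
linked_list = {"a": ["b"], "b": ["c"], "c": []}

# A binary tree: each node has up to two outgoing edges.
tree = {"root": ["left", "right"], "left": [], "right": []}

# Module dependencies: just more edges on more vertices.
deps = {"app": ["parser", "net"], "parser": ["lexer"], "net": [], "lexer": []}

print(dfs(linked_list, "a"))  # ['a', 'b', 'c']
print(dfs(tree, "root"))      # ['root', 'left', 'right']
print(dfs(deps, "app"))       # ['app', 'parser', 'lexer', 'net']
```

One function, three “different” data structures: that interchangeability is the whole point.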

So now, when I’m having trouble understanding how to do something with computers, I try to think about the problem in terms of a graph.

When quantum computing theory is better-understood, it will surely involve graphs. Maybe we’ll have superpositional edges and nodes which only exist when observed!