sharick.xyz Home Blog Projects Directory Misc Contact

Advent of Code

Technologies: Python, Git, Shell

Advent of Code

Advent of Code is a yearly Advent calendar of programming puzzles that has run since 2015, with 25 two-part problems each year, each part rewarding one "star". I completed my first two years, 2021 and 2022, during their respective Decembers, and decided after that to go back through and complete the previous six years, all in pure Python, using nothing outside of the standard library. If you just want to see the code, it's available here:
2021 · 2022 · 2020 · 2019 · 2018 · 2017 · 2016 · 2015 · personal library.
Read on for my thoughts on each year, and a selection of my favorite problems (usually three per year), based on my current feelings and my memories of what the problems were like. These will, of course, contain spoilers, so beware if you haven't completed the puzzles yet and plan to.

Utilities Library

In addition to actually solving the problems, I built up a personal library of utility functions to handle commonly needed operations, which is something I recommend to anyone who wants to go for 400 (or soon 450) stars. Having a comprehensive library can vastly shorten puzzles that require common operations (such as shortest paths on a graph), and help even in problems it doesn't trivialize (such as parsing common input patterns). The current components of my library are described below.

Scripting

To speed up the process of starting a puzzle, I wrote some scripts to automate the process. The make-folder script creates a folder for the day's puzzle and puts a skeleton for the code in it, the download-input script saves the input to a file, and the run-all script is a convenience to run every problem's code in sequence. The skeleton itself includes all of the libraries I usually need, a dummy parse-input function, and a main function that can run either unit tests or the actual code, depending on the command line arguments, as well as time the code.
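For illustration, here's a stripped-down sketch of what such a skeleton might look like; this is a hypothetical reconstruction, not my actual template, and the function names are placeholders.

```python
import time

def parse_input(raw: str):
    # Dummy parser: one entry per line, replaced with puzzle-specific logic
    return raw.strip().splitlines()

def solve(data):
    # Placeholder solution, filled in while working on the puzzle
    return len(data)

def main(raw: str):
    # Run the solution against the saved input and report the runtime
    start = time.perf_counter()
    answer = solve(parse_input(raw))
    elapsed = time.perf_counter() - start
    print(f"{answer} ({elapsed:.3f}s)")
```

The real skeleton also dispatches between unit tests and the actual input based on command line arguments.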


2021 [December 2021] · AoC Website · GitLab

This was the first year of AoC I attempted (I was considering doing 2020 at the time, but ultimately didn't), and so it incorporates some of the oldest forms of my problem-solving workflow, including the lack of any kind of utilities library, and a general lack of experience with the problems.

Day 8: Seven Segment Search (puzzle, code)

This was the first especially unique problem I encountered, and therefore the first that really showed me what an interesting AoC problem could look like. This puzzle is based on seven-segment displays, which can display a digit 0-9 by lighting seven line segments, but for which the letters corresponding to each segment are random for each display. Each display contains two parts: all ten digits rendered in the display's arrangement of line segments, in a random order, and then a four-digit code. The goal is to use the data from the first part to correlate the letters to their segments, and thereby decode the four-digit codes to solve the puzzle.

To solve it, I iteratively narrowed down the possibilities for each segment. Since the number one is the only digit with two segments, I could narrow those two segments down to only two possibilities; from there, the number seven has only one segment not in the number one, so that segment can be exactly deduced. Two more segments can be narrowed down by comparing the numbers one and four, and continuing this kind of logic produced the complete mapping of segments for each line. With this mapping, I could decode each display's four-digit code, and sum them all up to solve the puzzle.
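For the curious, here's a compact version of the standard deduction approach (organized differently from my segment-by-segment narrowing): the five- and six-segment digits can be told apart purely by their overlap with the known patterns for one and four.

```python
def decode(patterns, outputs):
    # Classify each pattern by segment count, disambiguating the five- and
    # six-segment digits by their overlap with the known 1 and 4 patterns
    sets = [frozenset(p) for p in patterns]
    one = next(s for s in sets if len(s) == 2)
    four = next(s for s in sets if len(s) == 4)
    digits = {}
    for s in sets:
        if len(s) == 2:
            d = 1
        elif len(s) == 3:
            d = 7
        elif len(s) == 4:
            d = 4
        elif len(s) == 7:
            d = 8
        elif len(s) == 5:
            # 3 contains both of 1's segments; 5 shares three segments with 4
            d = 3 if one <= s else (5 if len(s & four) == 3 else 2)
        else:
            # 9 contains all of 4; 0 contains all of 1; otherwise 6
            d = 9 if four <= s else (0 if one <= s else 6)
        digits[s] = d
    return int("".join(str(digits[frozenset(o)]) for o in outputs))
```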

Day 19: Beacon Scanner (puzzle, code)

This was the first especially long or difficult problem I did, taking just over 3.5 hours for part 1 and 5 hours in total, while nothing else this year even crossed the 2-hour mark. This problem presents around 30 "scanners" capable of detecting "beacons" within 1,000 units in each direction, with the complication that each scanner has a different origin and basis vectors, and they need to be correlated using the beacon data in order to solve both parts of the problem.

Because of the complexity of this problem, I didn't fully understand my solution at the time (leading to some inefficiencies), and still don't fully understand it. I found overlaps between different scanners' scans in the form of 12 shared beacons, accounting for the different orientations by trying 48 different basis transformations on the second scanner's beacons (although only 24 are valid orientations, I didn't and don't know which 24). By applying this procedure to all pairs of scanners, I could find and remove overlaps in the scanners' lists, and thereby solve part 1. Part 2, from here, required me to place each scanner in a unified coordinate space; since I was already tracking the offsets between the pairs I detected, I could recursively put all of them in the same coordinate space, at which point calculating the maximum distance between any pair is trivial.

Day 22: Reactor Reboot (puzzle, code)

Day 19 was the first especially difficult puzzle I encountered, but this one was the second, and compared to the first I think my solution is much cleverer and more elegant. In this problem, there's a sequence of steps in which every cell inside a cuboid of 3D space is set to either on or off, and the goal is to calculate how many cells are on after all the steps; for part 1, only the cube bounded by (-50, -50, -50) and (50, 50, 50) counts, while for part 2 the entire infinite 3D space does.

My initial approach for part 1 was the naive one: just keep a 101 by 101 by 101 3D array representing the valid space, and toggle the values of the array for each step. This worked fine for that part, but was obviously completely infeasible for part 2; I needed a new approach. To solve this, I worked with cuboids that could be split arbitrarily, allowing me to symbolically represent the areas toggled by each step, and the areas remaining after said step. In order to apply a step, I split every existing cuboid into 27 pieces based on their relation to the new step's cuboid (though I think it's possible in as few as 7), and then kept the ones that didn't overlap and actually existed. By applying this process to the entire list of cuboids, and then removing any overlaps, I could apply the complete step, after which I simply summed up the volumes.
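The core primitive underneath all of that splitting is an overlap test between two cuboids, which works independently on each axis; a minimal sketch (not my exact code):

```python
def intersect(a, b):
    # Overlap of two cuboids, each given as ((x1, x2), (y1, y2), (z1, z2))
    # with inclusive bounds; returns None if they don't overlap
    out = []
    for (lo1, hi1), (lo2, hi2) in zip(a, b):
        lo, hi = max(lo1, lo2), min(hi1, hi2)
        if lo > hi:
            return None
        out.append((lo, hi))
    return tuple(out)

def volume(cuboid):
    # Number of unit cells in a cuboid with inclusive bounds
    v = 1
    for lo, hi in cuboid:
        v *= hi - lo + 1
    return v
```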


2022 [December 2022] · AoC Website · GitLab

This was the second year I did AoC, and the year that motivated me to go back and finish all the previous years. Despite having more experience than the year before, I actually found this year more difficult, and a couple of problems (days 16 and 19) I wasn't able to complete within 24 hours, finishing them instead after December 25th. This year also saw the beginning of my utilities library, which I would later separate out and expand on.

Day 7: No Space Left On Device (puzzle, code)

This is a fun problem involving a file directory tree, but with an interesting way to present the input: a series of Unix ls and cd commands to traverse the directory tree and list all the files contained in it and their sizes. The goal is to find the total size of all small directories (those with a size under 100,000), and then to find the smallest directory to delete that will leave the total space used under 40,000,000.

This problem is largely a problem of input parsing, and is one of the first problems of 2022 with major potential pitfalls, namely failing to distinguish different folders with the same name at different places in the file tree. To avoid this in my solution, I tracked the current directory as a string while moving through the input, and either added new directories or removed the deepest one when seeing cd commands. The files are stored as a dictionary from canonical filename -> filesize, so I then calculated the size of each directory by summing the sizes of all files beginning with that directory's name, and from there solving both parts was easy.
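A sketch of this bookkeeping, using path tuples instead of my string-prefix scheme to keep same-named directories distinct (the accumulation logic is simplified from what my code does):

```python
from collections import defaultdict

def directory_sizes(terminal: str):
    # Total size of every directory, keyed by its full path as a tuple
    # so that same-named folders in different places stay distinct
    sizes = defaultdict(int)
    path = []
    for line in terminal.strip().splitlines():
        parts = line.split()
        if parts[:2] == ["$", "cd"]:
            if parts[2] == "/":
                path = ["/"]
            elif parts[2] == "..":
                path.pop()
            else:
                path.append(parts[2])
        elif parts[0].isdigit():
            # A file's size counts toward every directory containing it
            for i in range(len(path)):
                sizes[tuple(path[:i + 1])] += int(parts[0])
    return dict(sizes)
```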

Day 11: Monkey in the Middle (puzzle, code)

This is an example of a number theory problem, focused on modular arithmetic and modular representations of data. In this puzzle there are some monkeys (eight in my input) which each start with some items, each represented by a number for its "worry level". Each monkey will add to, multiply, or square the worry level of each item, divide it by three (in part 1 only), and then throw the item to one of two other monkeys depending on whether the worry level is divisible by a certain prime number. The goal is to run some number of rounds, count the total number of times each monkey threw an item, and multiply the two largest counts.

For part 1, I just implemented a naive simulation of the monkeys' logic, which was workable for that part because of the small number of rounds and the division by 3 that kept the numbers under control. For part 2, I needed to use modular arithmetic to keep the numbers down instead. Because the only thing the real worry level is needed for is to check for divisibility with the monkeys' numbers, I found the product of all those numbers and, after each monkey's operation, took the worry level modulo that product, preserving divisibility without allowing the values to grow indefinitely large. With this, my solution runs in about 1/3 of a second.
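The trick can be demonstrated in a few lines; here simulate takes stand-in operation functions rather than parsing real monkey input, and only ever stores the reduced worry level:

```python
from math import prod

def simulate(ops, divisors, start, rounds):
    # Keep the worry level reduced modulo the product of all divisors;
    # every "is it divisible by d" test gives the same answer as the
    # unreduced value would, but the number never grows unboundedly
    modulus = prod(divisors)
    worry = start % modulus
    for _ in range(rounds):
        for op in ops:
            worry = op(worry) % modulus
    return worry
```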

Day 17: Pyroclastic Flow (puzzle, code)

This is the somewhat well-known "tetris problem" (somewhat of a misnomer, because some of the shapes are actually pentominoes). There's a seven-unit-wide and infinitely tall tunnel, in which the five types of [tetra/pent]ominoes, flavored as rocks, fall in a repeating pattern, and the puzzle input is a repeating pattern of jets which are effectively left or right inputs on the figurative D-pad for the game. The goal is to simulate a large number of rocks falling, and determine the total height of the rock tower at the end.

As a scaling-type problem, my initial naive solution for part 1 proved drastically insufficient for part 2. For part 1, I simulated the exact shape of the entire rock formation using a 2D array, increasing its height with empty rows as necessary and simulating the rock's motion as it falls until landing. Although this could easily handle about 2,000 rocks, it's nowhere near capable of trillions, so I needed a new approach for part 2. That approach was cycle detection: find at what point the rock pattern begins looping, and then with a bit of modular arithmetic, I could easily calculate the height at an arbitrary point in the future. To do this, I kept the same rock simulation logic, but saved the current relative heights of each column in a dictionary along with the current point on both the rocks' and jets' cycles, and ran the simulation until this dictionary contained a duplicate key; from there, the arithmetic was simple.
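Stripped of the rock simulation itself, the cycle-detection arithmetic looks something like this; step_fn is a hypothetical stand-in for one simulation step, returning the next state and the height gained:

```python
def height_after(step_fn, state0, target):
    # Simulate until a state repeats, then jump ahead using the cycle
    seen = {}        # state -> (step index, height at that step)
    heights = []     # heights[i] = height after i steps
    state, height, step = state0, 0, 0
    while state not in seen:
        seen[state] = (step, height)
        heights.append(height)
        state, gained = step_fn(state)
        height += gained
        step += 1
    start, start_height = seen[state]
    if target < step:
        return heights[target]
    cycle_len = step - start
    cycle_gain = height - start_height
    # Whole cycles are pure multiplication; the remainder is looked up
    full_cycles, rem = divmod(target - start, cycle_len)
    return start_height + full_cycles * cycle_gain + (heights[start + rem] - start_height)
```

In the real puzzle the state is the relative column heights plus the positions in the rock and jet cycles, as described above.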


2020 [December 2022 - January 2023] · AoC Website · GitLab

Just after I finished up 2022, I began working through the backlog with 2020, and finished it in just over a week during my winter break. In addition to completing the puzzles, I added some helper functions to my utilities whenever I thought they might be useful.

Day 17: Conway Cubes (puzzle, code)

This problem is quite simple in and of itself, but I like it because it's a great example of how useful properly generalized library functions can be. The challenge is to run a simple cellular automaton identical to Conway's Game of Life, except in three or four dimensions, and calculate the number of living cells after six cycles.

For this problem, I used an arbitrary-dimension get_neighbors function to calculate the neighbors of each cell, making it trivial to move from part 1 to part 2 (it took 90 seconds between submitting the two correct answers). The actual code uses a set as a sparse grid, containing the positions of each lit cell without needing to handle constantly resizing an array, and uses a Counter to track how many lit cells each cell neighbors; this is then transformed into the new set of lit cells after a tick, which can be easily looped to find the answer.
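A sketch of how such a sparse-set tick can look; this get_neighbors is a simplified stand-in for my library version, and works in any number of dimensions:

```python
from collections import Counter
from itertools import product

def get_neighbors(cell):
    # All neighbors of a cell in any number of dimensions
    for delta in product((-1, 0, 1), repeat=len(cell)):
        if any(delta):
            yield tuple(c + d for c, d in zip(cell, delta))

def step(active):
    # Conway rules on a sparse set of active cells: count how many active
    # neighbors each cell has, then keep births (3) and survivals (2 or 3)
    counts = Counter(n for cell in active for n in get_neighbors(cell))
    return {cell for cell, k in counts.items()
            if k == 3 or (k == 2 and cell in active)}
```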

Day 18: Operation Order (puzzle, code)

This is one of the most unique problems in all of AoC, and as a result has some of the most unique solutions of any AoC problem. It's very simple conceptually: just evaluate a list of arithmetic expressions containing addition, multiplication, and parentheses, and sum their results, but with the caveat that the operator precedence between addition and multiplication is changed (either so they have the same precedence, or so that addition comes first).

Of the many ways to solve this problem, I settled on (ab)using Python's ast module to solve it for me. By replacing the multiplication signs with subtractions and parsing the result into an AST, I had the correct structure with the wrong operations; from there, I walked the tree, replacing ast.Sub nodes with ast.Mult to create the correct expression, and then evaluated it. For part 2, I extended the part 1 changes by also replacing, and then reverting, additions into divisions, which gives addition higher precedence than multiplication in the parsed tree.
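The part 1 trick fits in a few lines; this is a sketch of the technique rather than my exact code:

```python
import ast

def eval_equal_precedence(expr: str) -> int:
    # Pretend every * is a -, so Python parses + and * with equal,
    # left-to-right precedence, then swap the operators back in the tree
    tree = ast.parse(expr.replace("*", "-"), mode="eval")
    for node in ast.walk(tree):
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Sub):
            node.op = ast.Mult()
    return eval(compile(tree, "<expr>", "eval"))
```

Using eval here is safe enough, since the tree can only contain the arithmetic that was just parsed.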

Day 20: Jurassic Jigsaw (puzzle, code)

This is a behemoth of a puzzle, requiring some extremely complex logic to fit everything together. The puzzle presents 144 tiles, each a 10 by 10 square of bits, representing scrambled portions of a cohesive image; they need to first be reassembled into the complete image, then that complete image needs to be searched for a "sea monster" pattern (possibly requiring flipping or rotating the image), and finally the number of active spots not part of any sea monster needs to be counted.

For part 1, by chance I was able to find a shortcut to avoid assembling the entire image immediately. I represented each tile edge as a 10-bit binary number (ranging from 0 to 1023), treating hashes as 1-bits and dots as 0-bits; I also reversed the bit order for the edges to account for flipping them, meaning each tile had 8 edge IDs in total. From here, the corner tiles can be distinguished by the fact that exactly four of their edge IDs (the edges pointing outwards from the complete image) aren't present anywhere else.
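A sketch of the edge-ID computation, shown with a small tile for brevity (the real tiles are 10 by 10):

```python
def edge_ids(tile_rows):
    # Read each of the tile's four edges as a binary number ('#' = 1,
    # '.' = 0), in both directions to account for flipped tiles
    top = tile_rows[0]
    bottom = tile_rows[-1]
    left = "".join(row[0] for row in tile_rows)
    right = "".join(row[-1] for row in tile_rows)
    ids = []
    for edge in (top, bottom, left, right):
        bits = edge.replace("#", "1").replace(".", "0")
        ids.append(int(bits, 2))
        ids.append(int(bits[::-1], 2))
    return ids
```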

Part 2, however, requires assembling the entire image, using some not-so-pretty logic: first, collect all the current corner and edge tiles (edge tiles, similar to corner tiles, have exactly two unmatched edges), then arbitrarily pick a corner as "top left". Rotate the tile so it actually is top left, then pick tiles with matching edge IDs from the edge tiles to create the top row until reaching the top right corner. Repeat the same process on the other three sides to create a single "ring" of the final image, and then, treating that ring as if it didn't exist, find the corners and edges of the next ring going inwards. For these later rings, I also had to rotate them so that they'd match the ring around them. Once this process was finally complete, I checked each orientation of the image for the one that did contain sea monsters, and then saved their positions so I could count the number of hashes not part of any sea monster, finally solving the puzzle.


2019 [January 2023 - April 2023] · AoC Website · GitLab

2019 is the infamous Intcode year: roughly half of its problems involve writing a simulator for an assembly-like language called Intcode, and then running the puzzle input through that simulator in different ways. Intcode is polarizing, but I was a big fan of the concept: it tied the puzzles together a lot more than in most years, and by building on Intcode, the AoC team was able to create a lot of very unique puzzle concepts. This was also the year I finally split the utilities into their own repository, rather than keeping them part of the year's repository, meaning I could more easily update them and track changes to them. This year spanned the longest of any, because it coincided with a very demanding semester that left me without much time to work on AoC.

Day 7: Amplification Circuit (puzzle, code)

I'm a big fan of this puzzle because it showcases some of the really unique puzzle designs that Intcode makes possible. In this problem, there are five amplifiers (Intcode computers) which communicate with each other to calculate the value to send to the ship's thrusters; the goal is to reorder the five computers to find the highest possible thruster signal, and in part 2, to connect them in a continuous feedback loop.

Although this is a very interesting problem, it's still early in the month and hence quite easy to solve. In order to test each ordering, I looped through the arrangements with Python's itertools.permutations function and used a simple for loop to create the computers and run them in sequence. For part 2, all I needed to change was to separate the creation of the computers from their running, and then run them in a while loop rather than a for loop.
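The part 1 orchestration is tiny once the Intcode computer exists; in this sketch, run_amplifier is a hypothetical stand-in for one run of the computer:

```python
from itertools import permutations

def best_signal(run_amplifier, phases=range(5)):
    # run_amplifier(phase, signal) stands in for one Intcode computer run;
    # try every ordering of phase settings and keep the best output
    best = 0
    for order in permutations(phases):
        signal = 0
        for phase in order:
            signal = run_amplifier(phase, signal)
        best = max(best, signal)
    return best
```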

Day 14: Space Stoichiometry (puzzle, code)

This is an interesting problem that led me to create an entirely new utility library, frac.py, thanks to its combination of fractional math and tree-shaped data. The puzzle presents a list of formulas that look like 10 ORE => 10 A, 7 A, 1 E => 1 FUEL, or 7 DCFZ, 7 PSHF => 2 XJWVT, and part 1 asks how much ORE is needed to produce 1 FUEL; part 2 then asks how much FUEL can be produced with one trillion ORE.

To solve part 1, I used a topological sort to order the chemicals with FUEL first and ORE last, and then walked the tree in that order, calculating the total amounts of each chemical needed for the one FUEL using some modular arithmetic. Part 2 was considerably more complicated, because I had to account for the excess chemicals produced by most reactions, so I wrote the Fraction library, representing fractions as a class with an integer numerator and denominator. With this, I could represent the exact amount of each chemical needed to produce one FUEL, and preserve this exactness all the way up to ORE, at which point I easily calculated the solution.
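A bare-bones sketch of what such a fraction type needs; the real frac.py (and, for that matter, the standard library's fractions.Fraction) supports far more operations:

```python
from math import gcd

class Frac:
    # Minimal exact fraction: always stored in lowest terms
    # (assumes a positive denominator for simplicity)
    def __init__(self, num, den=1):
        g = gcd(num, den)
        self.num, self.den = num // g, den // g

    def __add__(self, other):
        return Frac(self.num * other.den + other.num * self.den,
                    self.den * other.den)

    def __mul__(self, other):
        return Frac(self.num * other.num, self.den * other.den)

    def __eq__(self, other):
        return (self.num, self.den) == (other.num, other.den)
```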

Day 20: Donut Maze (puzzle, code)

This puzzle is a great example of the pattern of reducing to a graph search that can be applied to a surprising number of problems. It presents a donut-shaped maze with an entrance labeled AA, an exit labeled ZZ, and then portals which have a corresponding location on the inside and outside of the maze. In part 1, these portals merely link the two locations, but in part 2 the maze is recursive, and a portal on the inside of the maze actually connects to an outside portal one level deeper; the exit only exists on the first level. The aim in both parts is to find the shortest path from the entrance to the exit.

This puzzle is infamous for having one of the worst input parsing experiences of any puzzle, because of its two-dimensional labels for the portals which can be both horizontal and vertical; my goal was to reduce the labels to a single cell in the grid, which required incredibly gnarly logic to handle. Once this was done, I converted the grid into a graph with my graph_from_grid library function, connected the two sides of the portals manually, and then used simplify_graph to reduce to only the important nodes before finding the total path length. For part 2, I extended the parsing by adding "-I" or "-O" to the end of each portal, representing the inside or outside respectively, and then created 30 clones of the graph at different depth levels and linked them together (the number 30 being arbitrary, but it works). With that, I could run the pathfinding again in order to find the new shortest path.

Day 22: Slam Shuffle (puzzle, code)

This problem has one of my all-time favorite solutions, because of all the different CS concepts it combines together. In this problem, you have a deck of N cards numbered 0 through N - 1 (for part 2, the interesting part, N is about 119 trillion), initially sorted, and are given a shuffling procedure consisting of dealing, reversing, and cutting the deck repeatedly. For part 2, the goal is to find what number will be on the card in the 2020th position after about 101 trillion shuffles.

There were a lot of insights I needed to use to solve the problem. The first was that I could represent each shuffling operation as either multiplication, addition, or both modulo the length of the deck, and therefore could compose them, and represent the shuffle as a whole as an equation of the form y = (mx + b) % n. From there, I got from Reddit a trick to calculate the number of shuffles needed using the fact that after n - 1 shuffles, the original order is restored. To actually perform this many shuffles (roughly 20 trillion), I again used the composability of the modular shuffle function, calculating 2 applications, 4 applications, etc. in the same y = (mx + b) % n form. With these powers of 2, I represented the total number of shuffles in binary and composed the appropriate binary digit functions in order to obtain the answer.
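The composability is easiest to see when each shuffle is stored as an (a, b) pair for the map x -> (a*x + b) % n; binary exponentiation then applies the whole shuffle trillions of times. A sketch:

```python
def compose(f, g, n):
    # f and g are affine maps x -> (a*x + b) % n, stored as (a, b) pairs;
    # returns the map "apply f, then g" in the same form
    a1, b1 = f
    a2, b2 = g
    return (a1 * a2 % n, (a2 * b1 + b2) % n)

def repeat(f, k, n):
    # Compose f with itself k times via binary exponentiation
    result = (1, 0)  # the identity map
    while k:
        if k & 1:
            result = compose(result, f, n)
        f = compose(f, f, n)
        k >>= 1
    return result
```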


2018 [April 2023 - May 2023] · AoC Website · GitLab

This year was fairly smooth, running from the time between semesters where I didn't have much occupying my time up through the beginning of the summer semester.

Day 20: A Regular Map (puzzle, code)

This problem is a fairly simple standard graph problem (or, really, tree problem) with an unconventional way of defining the graph: a regular expression where every string that matches the expression is a valid path, and every string that doesn't isn't. The hard part is converting this expression into the more useful form of an actual graph, and once that's done the puzzle is quite easy: all it requires is to calculate the furthest node from the start, and the number of nodes that are at least 1,000 steps away.

In order to construct the graph from the input, I used the pattern of recursively initializing a class from a string, which I've used for similar tree-based problems in the past. The Node class (structured as a tree, with a root node and children flowing one direction) is initialized first, and then converted into nodes and edges to construct my standard Graph class. From there, I used Dijkstra's algorithm to calculate the distance from the start to every reachable location, at which point I could easily find both the maximum distance and the number of locations whose distance is at least 1,000.

Day 21: Chronal Conversion (puzzle, code)

This puzzle is an example of the assembly reverse engineering puzzles that exist across multiple years, and in this case is also the culmination of two previous problems focusing on the same assembly-like language. The goal is to find the optimal starting value for one of the registers to either minimize or maximize the number of instructions the program will run.

As with other reverse engineering problems, I began by transforming the opaque assembly code into a more familiar language (in this case, Python) and introducing higher-level constructs like loops. The notes I took during the process partially document how I did this. Once the program was fully converted into more useful Python, I could repeatedly run the code until finding duplicates in the ending values of register 3, at which point the last value the register held was the answer to part 2. Part 1, on the other hand, I calculated manually, in a way I no longer remember.

Day 23: Experimental Emergency Teleportation (puzzle, code)

This problem is infamously one of the most difficult in all of AoC, because of how intractable naive solutions to it are, and because of how non-obvious the working solutions are. Although part 1 is fairly easy, part 2 is not. It asks: given a thousand "nanobots", each with a position in 3D space (almost always eight digits long) and a range, which 3D coordinate is in range of the largest number of nanobots?

After some help from the subreddit, I settled on a solution using an octree (https://en.wikipedia.org/wiki/Octree) to partition the 3D space, and perform a 3D analogue of binary search to find the desired coordinate. Starting with the "center" at the origin (0, 0, 0), I partitioned the entire space into eight cubes sharing the center as a corner, and found the cube with the most nanobots intersecting it somewhere (which required some hideously complex logic to determine). Once I found this cube, I partitioned it from its center again, and repeated the process until I was down to a 1x1x1 cube, which fortunately was the correct answer. My description here understates the intersection logic, which is the real meat of the solution, so I'd recommend reading the code if you want a fuller understanding of the complexity.
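As it happens, the bot-versus-cube intersection test can be written far more compactly than the logic my code uses, by measuring the Manhattan distance from the bot to the nearest point of the cube (a standard formulation, not my implementation):

```python
def bot_reaches_cube(bot, radius, lo, hi):
    # Manhattan distance from the bot to the nearest point of the cube,
    # where lo and hi are the cube's min and max corners; the bot's
    # octahedral range touches the cube iff that distance <= its radius
    dist = 0
    for p, l, h in zip(bot, lo, hi):
        if p < l:
            dist += l - p
        elif p > h:
            dist += p - h
    return dist <= radius
```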


2017 [June 2023 - July 2023] · AoC Website · GitLab

This year had no significant stumbling blocks for me, taking just over a week of time in between the end of the summer semester and the beginning of my internship.

Day 9: Stream Processing (puzzle, code)

This puzzle presents a stream of characters containing groups, delimited with {}, and garbage, delimited with <>. Groups can contain any number of both other groups and garbage, and garbage can only contain random characters, but garbage can also have characters in it be canceled by !, including the ending delimiter. Part 1 asks for the "score" of the stream, where every group has a score equal to how deep in the tree it is, and part 2 asks for the total number of characters in all of the garbage combined.

Similar to problems like 2018 Day 20, I solved this using a recursively initialized tree-like data structure, but this time I worked entirely within that structure. Although I generally don't use object-oriented patterns in AoC, this puzzle was an exception, where I put the entire code inside the Group and Garbage classes. Each class initializes itself from a string, passes the substring to create child nodes, and then skips over them to continue initializing. Once I had these classes, I could easily implement the calculations for both score and garbage size as recursive methods on the classes, traversing the tree to get the answer.
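For contrast, both answers can also be computed in a single non-recursive pass over the stream; a sketch of that alternative approach, not my class-based one:

```python
def score_stream(stream: str):
    # One pass with a depth counter: groups score their nesting depth,
    # and garbage characters are counted (minus the canceled ones)
    depth = total = garbage_chars = 0
    in_garbage = False
    i = 0
    while i < len(stream):
        c = stream[i]
        if in_garbage:
            if c == "!":
                i += 1          # skip the canceled character
            elif c == ">":
                in_garbage = False
            else:
                garbage_chars += 1
        elif c == "<":
            in_garbage = True
        elif c == "{":
            depth += 1
            total += depth
        elif c == "}":
            depth -= 1
        i += 1
    return total, garbage_chars
```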

Day 14: Disk Defragmentation (puzzle, code)

This problem also takes advantage of my graph library to simplify the logic, but unlike most cases, I don't use it for pathfinding or searching of any kind. In this problem, a custom cryptographic hash called a knot hash is used to define a 128 by 128 grid, using the bits of one hash to determine "used" and "free" squares on a single row. Part 1 asks for the total number of "used" squares, and part 2 for the total number of connected regions of "used" squares (which, for anyone familiar with graph algorithms, might ring a bell).

Part 1 was easy with my existing knot hash implementation, because I could just take that and use Python's built in int.bit_count (a popcount equivalent) on each row to get the number of 1 bits and hence "used" squares. For part 2, I began by using the knot hash to create the grid as a 2D array of integer 0s and 1s, using the string representation as a convenience in this process, and then used the graph library to convert that grid into an actual graph. With that, simply calculating the number of connected components allowed me to get the answer for part 2.

Day 21: Fractal Art (puzzle, code)

This puzzle involves an "image", represented as a square grid of binary pixel values, that needs to be repeatedly enhanced by following certain rules. The image is divided into either 2x2 or 3x3 squares, depending on which it's divisible by, and depending on the pattern each square matches, it is replaced by a larger pattern (either 3x3 or 4x4), producing a larger image. The goal is simply to determine how many of the pixels will be on after some number of steps of the enhancement process.

In my solution (like, I'd imagine, most solutions), the process of enhancing the image once has three steps: split it into smaller squares, convert those squares by matching them to a rule, and then reassemble the larger image. To handle the flipped and rotated forms of the patterns, I defined inline functions for the three types of flips (horizontal, vertical, and both), and used the Python rotation trick of list(zip(*reversed(grid))), which rotates a 2D array 90 degrees clockwise. By composing them, I got all the possible forms of each pattern, and could then check each sub-square against them to complete the image enhancement. Although this algorithm is slow (taking just under 4 minutes for part 2), it does work, and I don't generally focus too much on speed in AoC as long as I can solve the puzzles.
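Generating all eight forms of a pattern from that one rotation trick looks something like this (a sketch; my actual code composes flips and rotations differently):

```python
def rotate(grid):
    # zip(*reversed(grid)) rotates a 2D array 90 degrees clockwise
    return [list(row) for row in zip(*reversed(grid))]

def orientations(grid):
    # Four rotations of the grid and of its mirror image: 8 forms total
    forms = []
    for g in (grid, [row[::-1] for row in grid]):
        for _ in range(4):
            forms.append(g)
            g = rotate(g)
    return forms
```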


2016 [August 2023] · AoC Website · GitLab

This was another quick year, being completed over the course of two weekends.

Day 17: Two Steps Forward (puzzle, code)

This puzzle, in addition to continuing the MD5 hash theme of many of the other 2016 puzzles, also presents an interesting type of maze. This maze is only 4x4 in size, but the doors between adjacent cells will open or close depending on the path that's already been taken, using an MD5 hash of the path to decide whether the doors are open. Because of this, it's impossible to effectively map out the maze ahead of time, and it has to instead be actively explored to find both the shortest and longest path to the end.

Although the concept is unusual, the problem itself can be solved with standard maze search algorithms. For the first part, I implemented a simple breadth-first search, which is guaranteed to find the shortest possible path, by keeping track of both the position and existing path in the queue of positions. For part 2, I continued to use a breadth-first search, but instead of returning when reaching the maze's end, I instead kept track of the longest seen path and updated it when reaching the end. I also implemented a check to ensure I didn't visit any paths multiple times, to guard against looping forever.
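A sketch of the part 1 breadth-first search (close in spirit to my solution, though not my exact code):

```python
from collections import deque
from hashlib import md5

def shortest_path(passcode: str) -> str:
    # BFS over (x, y, path) states in the 4x4 grid; a door is open when the
    # corresponding character of md5(passcode + path) is one of "bcdef"
    moves = [("U", 0, -1), ("D", 0, 1), ("L", -1, 0), ("R", 1, 0)]
    queue = deque([(0, 0, "")])
    while queue:
        x, y, path = queue.popleft()
        if (x, y) == (3, 3):
            return path  # the first path BFS finds is a shortest one
        digest = md5((passcode + path).encode()).hexdigest()[:4]
        for (step, dx, dy), door in zip(moves, digest):
            nx, ny = x + dx, y + dy
            if door in "bcdef" and 0 <= nx < 4 and 0 <= ny < 4:
                queue.append((nx, ny, path + step))
    return ""
```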

Day 20: Firewall Rules (puzzle, code)

This puzzle is fairly simple by the standards of final-week puzzles, but its simplicity also makes it an elegant and unique problem with a very clean solution. The problem presents a blacklist of IP addresses defined by ranges, such as 7892-9012, and asks for both the lowest non-blocked IP address and the number of non-blocked IP addresses.

To solve it, I drew on my ranges library to detect overlaps between the blacklisted ranges, and then merge all of them into as few ranges as possible. By then sorting them, I could ascend the list of ranges until finding a gap in the blacklist, at which point I had the answer to part 1. For part 2, instead of just taking the first gap in the blacklist, I looked at each gap, and counted up the total length of the gaps to get the final answer.
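A sketch of the merge-then-scan logic (my ranges library differs in its details):

```python
def merge_ranges(ranges):
    # Sort by start, then fold together ranges that overlap or touch
    merged = []
    for lo, hi in sorted(ranges):
        if merged and lo <= merged[-1][1] + 1:
            merged[-1][1] = max(merged[-1][1], hi)
        else:
            merged.append([lo, hi])
    return merged

def lowest_allowed(ranges, floor=0):
    # First value at or above floor not covered by any blacklisted range
    value = floor
    for lo, hi in merge_ranges(ranges):
        if value < lo:
            break
        value = max(value, hi + 1)
    return value
```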

Day 22: Grid Computing (puzzle, code)

This is a more complex variation on the sliding tile puzzles you might be familiar with as a way to kill time. In this problem, there's a grid of nodes that store data, some of which have a small amount, some have a large amount, and one is empty. The goal is to find the path to get the tile in the top right to the top left, by swapping the empty node with the small nodes; the large nodes act as impassable walls.

Part 1 (which just requires finding pairs of nodes where one node's data can fit in the other node) was very simple to solve: all I did was compare each pair of nodes to find the ones that fit the viability criteria. Part 2 (the actual slide puzzle), on the other hand, was more complicated and involved a degree of manual work. I rendered the grid of nodes using text so I could visualize the path the data would need to take to reach its destination, which turned out to be very simple, requiring only two turns. Using that knowledge, I could calculate the length of each segment of the path, and then easily calculate the total cost of the path.


2015 [November 2023] · AoC Website · GitLab

As the first year of AoC, as well as the last one I completed, this year largely contains simple puzzles I was able to rapidly complete as I raced to finish before the December 1st deadline, which I was able to do without much difficulty.

Day 19: Medicine for Rudolph (puzzle, code)

This problem is generally agreed to be the hardest problem of 2015. It provides a list of rules for transforming one atom into several (such as H => CRnAlAr or P => PTi) and a target molecule a few hundred atoms long, and asks for the minimum number of rule applications needed to transform a single electron into the target molecule.

My solution for this puzzle is actually very unsatisfying. At first, I attempted to solve it by analyzing the input, and noticed a pattern with the Rn and Ar atoms in the rules, which I used to parse the molecule into a tree. From there, however, I couldn't figure out how to continue, and looked at the subreddit for hints. I saw some hints that pointed towards building on what I'd already done, but I also saw some mentioning a randomized solution, which inspired me to try just greedily matching every rule against the molecule, which, shockingly, worked.

Day 22: Wizard Simulator 20XX (puzzle, code)

In this problem, you have to simulate a turn-based battle between the player and a boss: the player has five spells that can be cast using various amounts of mana, which can damage the boss, recover HP or mana, and protect from attack; the boss deals a fixed amount of damage each turn. The goal is to figure out the smallest possible amount of mana the player can spend and still win, and then to do the same in "hard mode", which adds an additional 1 HP loss per turn.

For part 1, I implemented a breadth-first search through the tree of all possible spell orders, using combat simulation to tell which spells were valid and when combat ended. Although this worked, it was very poorly written (most notably, rather than simulating player and enemy turns separately, it attempted to simulate them at the same time with some extremely fragile and repetitive logic), which meant that the part 2 solution I expected to work didn't. I solved this by committing the software development cardinal sin: I fully rewrote the code, which despite my joke was fine because of how short the code actually is. With my better understanding of what the code needed to actually look like, the rewritten version was cleaner, shorter, and worked on both part 1 and 2 after some testing.