diff --git a/Minecraft Datapacking/When Two Macros are Faster than One.md b/Minecraft Datapacking/When Two Macros are Faster than One.md
index 53852cb..d958bf4 100644
--- a/Minecraft Datapacking/When Two Macros are Faster than One.md
+++ b/Minecraft Datapacking/When Two Macros are Faster than One.md
@@ -128,7 +128,7 @@ The `two_macro` function is *2.4x* faster than the `one_macro` function.
 What the heck is going on? How does *adding* an entire second macro function *improve* performance??
 
-It turns out that the clever and convenient `one_macro.array[string:$(keyword)]` triggers iteration to filter the array. Since the iteration is triggered by a macro, it directly runs Java code. It's still much faster than iterating in mcfunction, but the performance hit is O(n). In contrast, the `two_macro` approach directly accesses values by `key` and `index`. These operations have a performance hit of O(1). While I haven't tested it, this means that, when run on a larger dataset, the gap between `two_macro` and `one_macro` should continue to widen.
+It turns out that the clever and convenient `one_macro.array[string:$(keyword)]` triggers iteration to filter the array. Since the iteration is triggered by a macro, it directly runs Java code. It's still much faster than iterating in mcfunction, but the performance hit is O(n). In contrast, the `two_macro` approach directly accesses values by `key` and `index`. These operations have a performance hit of O(1). This was confirmed by **Nicoder**. While I haven't tested it, this means that, when run on a larger dataset, the gap between `two_macro` and `one_macro` should continue to widen.
 
 # takeaways
 
 Indexing is cool. If you find yourself in a situation where you're working with moderate-to-large arrays and are able to index in advance of querying data, it's absolutely worth it from a query performance standpoint.
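
For a concrete picture of why the O(1) path avoids a scan, here is a minimal sketch of a `two_macro`-style lookup. All storage names, paths, and function ids below are hypothetical placeholders, not the article's actual implementation; the sketch simply assumes a keyword → index map has been built in storage ahead of time, next to the array itself.

```mcfunction
# two_macro/find.mcfunction (hypothetical names throughout; this function is
# itself a macro function, called with a compound that supplies $(keyword))
# Step 1: resolve the keyword to a list index via the prebuilt keyword -> index map.
# Reading a single compound key is a direct lookup, not an array scan.
$execute store result storage example:db args.index int 1 run data get storage example:db index.$(keyword)
# Step 2: pass the resolved index on to the second macro function.
function example:two_macro/get with storage example:db args

# two_macro/get.mcfunction (hypothetical)
# Step 3: positional access by index, so no filtering iteration occurs.
$data get storage example:db array[$(index)]
```

Neither step asks the game to filter the list, so the lookup cost stays constant regardless of how large the array grows; the price is maintaining the index map whenever the array changes.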