
The AB patterns in the article remind me a bit of the sequences which occur in “The anti-pattern game” – a game I invented for a course I took in modal logic. The game produces sequences where no pattern repeats more than twice in a row:

● ● ○ ● ● ○ ● ○ ● ● ○ ● ● ○ ○ ● ● ○ ● ● ○ ● ○ ● ● ○ ● ● ○ ○ ● ● ○ ● ● ○ ● ○ ● ● ○ ● ○ ○ ● ● ○ ● ● ○⋯
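One way to get a feel for such sequences (purely a sketch of mine, not the game's actual rule): search for binary words in which no block occurs three times in a row ("cube-free" words), e.g. by backtracking:

```python
def has_cube(s):
    """True if some block occurs three times in a row at the end of s."""
    n = len(s)
    for k in range(1, n // 3 + 1):
        if s[-k:] == s[-2*k:-k] == s[-3*k:-2*k]:
            return True
    return False

def cube_free(length):
    """Find a binary cube-free word by depth-first search with backtracking."""
    s = []
    def extend():
        if len(s) == length:
            return True
        for symbol in "●○":
            s.append(symbol)
            if not has_cube(s) and extend():
                return True
            s.pop()
        return False
    extend()
    return "".join(s)

print(cube_free(20))
```

Note that a naive greedy rule preferring ● paints itself into a corner: after ●●○●●○●●, appending ● makes the cube ●●●, while appending ○ makes (●●○)(●●○)(●●○), which is presumably what makes the game interesting. Backtracking sidesteps that.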

I put a brief description of it up here:

It feels like this sequence should be related to the Thue–Morse sequence (https://en.wikipedia.org/wiki/Thue–Morse_sequence), sometimes called the 'fair share' sequence:

○ ● ● ○ ● ○ ○ ● ● ○ ○ ● ○ ● ● ○ ● ○ ○ ● ○ ● ● ○ ○ ● ● ○ ● ○ ○ ● ⋯

I don't think this contains any 'anti pattern game' losing sequences, but I may be missing something.
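You're not missing anything: the Thue–Morse sequence is cube-free (a classical result of Thue), so no block ever occurs three times in a row. A quick brute-force sketch checking a prefix, using the fact that t(i) is the bit-parity of i:

```python
def thue_morse(n):
    """First n terms of Thue-Morse: t(i) = parity of the number of 1-bits in i."""
    return [bin(i).count("1") % 2 for i in range(n)]

def has_cube(seq):
    """True if some block occurs three times consecutively anywhere in seq."""
    n = len(seq)
    for start in range(n):
        for k in range(1, (n - start) // 3 + 1):
            if seq[start:start+k] == seq[start+k:start+2*k] == seq[start+2*k:start+3*k]:
                return True
    return False

t = thue_morse(200)
print("cube found:", has_cube(t))  # → cube found: False (Thue's theorem)
```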

Wouldn't a simple version of this be ● ○ ● ○ ○ ● ○ ○ ○ ● ... etc?

I was just thinking the same thing, but that only accounts for repeating patterns that start from the beginning. If you look at your example, you can already see that ○ ○ ○ repeats the pattern ○ three times in a row.

Oooooh, wow! Good point, and it makes things much harder! Will read OP's post in more depth later, then.

Edit: I wonder if you can use property-based testing to prove this fact...


Title should be "Math *Patterns* That Go On Forever but Never Repeat."

The article title is correct, but the article page's HTML title is wrong / SEO

I have a question for serious math nerds: are there degrees of never repeating? Like if you think of repetitiousness like escape velocity, as you get less repetitious the number of digits between repeats increases until it hits infinity, aka escape velocity. Can repetitiousness decrease even beyond that?

There is Kolmogorov complexity, which is (relative to a programming language) a measure of how easy a sequence is to produce algorithmically, i.e. the length of the shortest program which can produce the sequence. Most mathematicians believe that not all sequences can be produced algorithmically, so this leaves some sequences too complicated to have a finite Kolmogorov complexity. But you can go on into the infinite by adding oracles, and start assigning infinite ordinals to sequences, depending on how many oracles you need for a given sequence. If a program uses an oracle for the halting problem, say, you nominally add ω to its length.

All the repeating sequences have finite Kolmogorov complexity. So, if you want, you can say that the sequences with infinite Kolmogorov complexity have even lower repetitiousness than the ones we can produce algorithmically. And this will go on and on through the countable ordinals.
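To illustrate the "repeating implies finite Kolmogorov complexity" direction: any eventually repeating sequence is pinned down by a finite amount of data (a preamble plus a period), so a fixed small program suffices. A rough sketch (the helper name is mine):

```python
from itertools import chain, cycle, islice

def periodic(preamble, period):
    """Any eventually repeating sequence is fully described by this finite
    data, so its Kolmogorov complexity is bounded by len(preamble) +
    len(period) plus the constant size of this function."""
    return chain(preamble, cycle(period))

# 0.58333... = 0.58(3): a finite description of an infinite decimal expansion
digits = list(islice(periodic([5, 8], [3]), 10))
print(digits)  # → [5, 8, 3, 3, 3, 3, 3, 3, 3, 3]
```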

> Most mathematicians believe that not all sequences can be produced algorithmically, so this leaves some sequences too complicated to have a finite Kolmogorov complexity.

Isn't this just true?

The set of algorithms is countable, and the set of sequences of real numbers is uncountable.

IANAMathematician, but it's "true" in the sense that we primarily use a set of axioms (ZF/ZFC) which produce this result. But in a sense one of the axioms is that these things exist (powerset axiom), so this result is a bit less illuminating than one might hope.

My understanding is that we can reason about these sorts of non-constructible sequences but we can't really know if they "really exist" because it's not clear what that even means if their existence has no implications for physical reality (which is possible but not known). So now we're into the territory of whether there is such a thing as abstract truth independent of reality.

I wrote a blog post in the past about my reading on this topic, and I can't claim that it's accurate, but at least I tried to talk about this subject: https://blog.kevmod.com/2022/04/09/do-the-real-numbers-exist...

It depends on believing in classical logic. Constructive mathematicians can instead choose to adopt intuitionistic logic and postulate Church's thesis, which states that all functions, and thus all sequences, are computable.

*The set of algorithms is countable*

That is assuming finite algorithms. I'm curious if infinite algorithms exist and/or are considered in maths.

The internet says that an algorithm is finite by definition, but it’s hard to search for “infinite algorithm” because everyone talks about termination instead.

Aren’t self-modifying programs a potential implementation of an infinite algorithm?

I'm not clear on the definition of sequence such that you could have one without having a description of how to produce it? Can you have a sequence of random numbers?

They are necessarily non-constructive existence proofs, for sure. You're not alone in some potential unease with that: https://en.wikipedia.org/wiki/Constructivism_(philosophy_of_...

Any representation of any irrational number will be an endless and never repeating sequence of seemingly random digits.

I think the term of interest here is non-computable: irrational numbers can have finite Kolmogorov complexity.

You're kind of "begging the question" here, where you're assuming that non-computable numbers exist and then using that to show that some numbers are non-computable. You can definitely show that these things exist, but that relies on "believing" the set of axioms that you used to prove it.

How do we identify a specific non-computable number?


One way of looking at things like this is from the perspective of so-called Ramsey theory. Notions like syndeticity (the property of a set having bounded gaps), upper density (the lim sup of the proportion of numbers in the set), thickness (the property of containing arbitrarily long runs of consecutive numbers), etc., all capture different notions of this.

Edit: let’s not forget the rich theory involved in examining if sets/sequences contain arithmetic (or geometric) progressions! The well known Green-Tao theorem about primes containing arbitrarily long arithmetic progressions is one such result. In fact, the real Green-Tao theorem says that any set which has nonzero upper density with respect to the set of primes contains arbitrarily long arithmetic progressions.


True randomness with infinite sequences should generate infinitely repeating sequences and infinitely non-repeating sequences.

I’m not sure I agree. Can you be more precise about what you mean by repeating? If you mean “contains an infinite arithmetic progression” then surely this is not true.

Could you also be more precise about what you mean by true randomness?

Cantor liked this post.

One thing this aperiodic monotile had me wondering: does it have any non-trivial *non-viable partial tilings*? By which I mean, you start laying down tiles and it looks like they fit together fine. Your construct has no holes and is "reasonably convex", in that nowhere does it curve inwards by a lot. But because of some mysterious long-range self-interaction it's actually impossible to complete the tiling into something encompassing the whole plane.

Depends on your definition of non-trivial and reasonably convex.

After playing with the tiles a little, it seems to me that placing 4 (maybe even 3) hats on top of each other is one such configuration. You can place 1-2 more tiles on each side, but you quickly end up in a state where the dead end is obvious.

As a kid I did a thing with my fingers where I'd start with this simple pattern (mapped to some finger motions):

DOWN UP UP DOWN

This is the UP-pattern. The DOWN-pattern is UP DOWN DOWN UP.

Then I'd do it again, but replacing every DOWN with the DOWN-pattern and every UP with the UP-pattern.

Then I'd try to do it again for *that* pattern. I don't remember how deep I could get. It wasn't very far! But I loved doing it.

Decades later I met someone who had done the exact same thing as a kid, though he called it LEFT RIGHT and moved his tongue around to do it, rather than tapping it out with his fingers.

Technically I ALSO did it left to right, I just didn't think of it that way. For UP-pattern I'd tap, or really roll/drum, left-to-right, index-middle-ring-pinky. And DOWN-pattern was tapped right to left.

I also did this as a kid.

As an adult I discovered it's called the Thue-Morse sequence.

It has a lot of fascinating properties that I think you'll be as delighted to learn about as I was.

I had that too: compulsively running through it in my head, tapping it out with my fingers, or gnashing my teeth in the pattern. Then there was a time with a fever dream... some crazy thoughts!

Interesting topic, terrible (IMO) title.

Did the original paper come out before they realized "the hat is actually one of an infinite family of aperiodic tiles"? Because I feel like the team really missed out on a PR opportunity by going with a "hat" over an APERIODIC MONOTURTLE.

But it's not really one tile, if you have to use it and a mirror, is it? Also, the final image where the two tiles are shown with 4 colours makes it confusing trying to see which is which.

Now, where can I buy enough actual tiles in this shape to make my bathroom floor an absolutely unique thing to behold?

I thought this was describing the "number line" of natural numbers

```
i = 0
while True:
    i += 1
```

Also never repeats... except for the forever part.

Ints (8- to 128-bit) will either wrap around (repeating) or raise an overflow error in any language I've ever used.

Python uses BigInts, so it will go until it runs out of RAM, I believe.

Imagine a computer lasting long enough to count to a number larger than what fits in 8GB of RAM...

Let's see...

The age of the universe in Planck time requires 200 bits to represent as an integer.

The number of Planck cubes in the universe can be represented with 611 bits.

1 KB can represent an integer as large as 2^8000 ≈ 10^2408. 10^2408 Planck times is 10^2347 × (age of universe). The last of the stars will burn out in 10^73 Planck times from now.
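Those back-of-the-envelope figures roughly check out. A sanity check using my own approximate constants (age of the universe ≈ 4.35e17 s, Planck time ≈ 5.39e-44 s, observable-universe radius ≈ 4.4e26 m, Planck length ≈ 1.616e-35 m):

```python
import math

AGE_S    = 4.35e17    # age of the universe in seconds (approx.)
PLANCK_T = 5.39e-44   # Planck time in seconds
RADIUS_M = 4.4e26     # radius of the observable universe in metres (approx.)
PLANCK_L = 1.616e-35  # Planck length in metres

# Bits needed for the age of the universe counted in Planck times
age_bits = math.log2(AGE_S / PLANCK_T)

# Bits needed for the number of Planck volumes in the observable universe
vol_ratio = (4 / 3) * math.pi * RADIUS_M**3 / PLANCK_L**3
vol_bits = math.log2(vol_ratio)

print(round(age_bits))                   # → 202 (the "about 200 bits" claim)
print(round(vol_bits))                   # → 614 (close to the 611 quoted)
print(math.floor(8000 * math.log10(2)))  # → 2408, i.e. 2^8000 ≈ 10^2408
```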

Hey this is a math fight, no computer architecture allowed!

Yes, if I knew the mathematical notation for the same thing I'd have used that. Overflow is not the point.
