Question: How do you efficiently encode an arbitrarily large integer using bits? And to make things interesting, let's say I can't see how long your encoding is: I just start reading bits, and the stream itself must tell me when to stop. Otherwise you could just write the number in plain binary and we'd be done.
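
For concreteness, here is a minimal sketch of one classic answer, Elias gamma coding, in Python (my choice of language and representation, not the original's; the bit stream is modeled as a string of '0'/'1' characters for clarity). The idea: send the length of n's binary representation first, in unary, so the reader knows exactly how many binary digits follow.

```python
def gamma_encode(n: int) -> str:
    """Elias gamma code for n >= 1: (len-1) zeros, then n in binary."""
    if n < 1:
        raise ValueError("gamma coding handles integers >= 1")
    binary = bin(n)[2:]            # binary digits of n, leading 1 first
    return "0" * (len(binary) - 1) + binary

def gamma_decode(bits: str) -> tuple[int, str]:
    """Read one gamma-coded integer from the front of a bit string.

    Returns (value, remaining bits). The code is self-delimiting,
    so no external length information is needed.
    """
    zeros = 0
    while bits[zeros] == "0":      # count the unary length prefix
        zeros += 1
    end = zeros + zeros + 1        # prefix is followed by (zeros + 1) digits
    return int(bits[zeros:end], 2), bits[end:]

# Example: two numbers back to back decode unambiguously.
stream = gamma_encode(13) + gamma_encode(5)   # "0001101" + "00101"
a, rest = gamma_decode(stream)
b, rest = gamma_decode(rest)
assert (a, b) == (13, 5) and rest == ""
```

The unary prefix costs about as many bits as the number itself, so a gamma code for n takes roughly 2 log2(n) + 1 bits, which already satisfies the puzzle's constraint; whether one can do asymptotically better is exactly what makes the question interesting.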