if it ain't broke: fix it till it is.
By anders pearson 23 Nov 2000
while everyone else in the US was gorging themselves on tryptophan today, i was hunched over my keyboard eating triscuits, listening to Disturbed, and birthing the newest incarnation of /dev/random.
it’s not much to look at yet, as i’ve decided to eschew fancy design for the time being until i get everything working just the way i want it to.
but there are lots of fun new toys. obviously bookmarks and sketches are now integrated in what i hope is a nicer way than before. unfortunately i have to repost the sketches by hand, so it may take me a few days before i get them all up in the new section.
bookmarks should be fun. i’m hoping that if the other seeders all post their bookmarks up on here, we’ll have sort of a nice little micro search engine of our own consisting entirely of links that were hand-selected to be useful and interesting.
there is a little search box now too. it works. it doesn’t do anything fancy and it’s one of the areas that i’m working on improving.
and perhaps you noticed the “random post” above by “markov” and you’re wondering what the hell the deal is with that? markov is the statistical hive mind of the seeders. specifically, markov is a second-order markov chain algorithm applied to the total text of all the posts in the database.

a first-order markov chain stores a matrix of the transition probabilities for all the words in the text we’re looking at. a random word from the text is selected, then the algorithm looks up the possible transitions for that word (e.g., if we randomly selected the word “the”, the matrix might show that statistically “the” is followed by “cat” 20% of the time and “dog” 80% of the time, so we pick one of those probabilistically). then that word is looked up, a new word is picked based on the transition matrix, and the process continues.

our markov chain is 2nd order, though. that means that instead of looking at transitions from one word to another, it looks at sequences of two words transitioning to a third (e.g., instead of “the” transitioning to “dog” or “cat”, we have “the dog” transitioning to “runs” or “barks”. then we randomly pick “runs” and repeat the process with “dog runs”, etc). the net result of all this is that you can generate cool random texts that tend to almost make some sense and can be quite amusing. markov is still pretty stupid and can’t do basic things like starting and finishing sentences where you would expect, but i’m working on that.
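the idea is simple enough to sketch in a few lines of python. this is just a rough illustration of the second-order chain described above, not the actual /dev/random code; the function names are mine:

```python
import random
from collections import defaultdict

def build_chain(text):
    """map each pair of consecutive words to the list of words
    that follow that pair anywhere in the text. duplicates in the
    list stand in for the transition probabilities."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - 2):
        chain[(words[i], words[i + 1])].append(words[i + 2])
    return chain

def generate(chain, length=20):
    """walk the chain: start from a random word pair, pick one of
    its followers at random, then shift the pair forward and repeat."""
    pair = random.choice(list(chain.keys()))
    out = list(pair)
    for _ in range(length - 2):
        followers = chain.get(pair)
        if not followers:
            break  # dead end: this pair only occurs at the end of the text
        nxt = random.choice(followers)
        out.append(nxt)
        pair = (pair[1], nxt)
    return " ".join(out)
```

feed `build_chain` the full text of the posts and call `generate` on the result; because every emitted word has to follow its two predecessors somewhere in the source text, the output tends to be locally grammatical even when it’s globally nonsense.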
so enjoy the new toys and let me know if you find any bugs (i wouldn’t be surprised) or can think of anything that would make it even nicer. i’ll be posting the new source code once i have a chance to pretty it up a bit.
and, once things are settled down a bit, i’ll be putting together a shiny new design.