Social Networking
TypePad scores ... 8.5!
I've decided to keep the paid TypePad account. In general, TypePad is very usable and connects to a multitude of social services that allow easy integration (like FeedBurner). I'm currently considering splitting the TypePad page into at least 2 separate blogs — one for research/science/A.I. stuff and the other for more general issues. But nothing is decided, except that I will seldom post on this blog from now on.

Here's a link to my new blog.

-hthth
|
Trying out TypePad
For the next 30 days I'll be blogging at a TypePad trial account I just set up.

Come on over to hthth.typepad.com and check it out.

-Hrafn Th.
|
Considering blogging services
I've been using RapidWeaver as a web development tool ever since I changed the look of my website and actually started blogging. However, there are some serious problems with it that continue to annoy me — such as:

  • Not being able to edit HTML code directly (aside from the basic theme)
  • Only being able to blog from home (e.g. no web-interface for blogging and my laptop is broken)
  • Some bad bugs started sprouting wings when I started looking into social networking and the main blogosphere (wrong titles showing up on Technorati searches, etc.)
  • I have my suspicions that the generated HTML is not very SEF (search engine friendly)


So I've started to look for a blogging service — and looked at Typepad.com today. They charge for hosting, which is bad for a poor, poor student who just wants to do research and disregards other job offers. But over these past few months I've grown to enjoy blogging quite a lot. It enhances your writing skills and incites you to soak up more information from all around the world. So maybe I'll go for a paid service.

The interface of TypePad is quite nice, and you're able to customize most of the components on the site (direct access to HTML is only available to pro members, at $14.95 a month). If anyone has any thoughts or recommendations I'd be glad to hear them.

To be continued....
|
On being first with the story ...
A few blogs ago I wrote about how I found a site where you're able to buy radioactive materials online for $69. This was in relation to Polonium 210 that killed Litvinenko, the former Russian spy in Britain.

So now, 3 days later — I see that CNET News and the San Francisco Chronicle & Information Week are finally catching up, talking about a site online that sells Polonium 210 for $69. Also, this blogger, this blogger and this blogger. Savvy I'm savvy! Although, Kathryn Cramer1 blows me out of the water explaining how polonium is made.

According to the latest news, it seems that Po-210 is the new anthrax: "Polonium-210, which has recently become well-known as one of the deadliest substances in the world". News-hype or reality? Note the entertaining play on words "one of the deadliest..." — on what scale? Really.

1: Originally, "Kathryn Cramer" read "this guy" — an accident which I blame on browsing too many blogs at the same time. Ms. Cramer was kind enough to leave me a comment informing me of my mistake. My sincerest apologies. Check out her blog — a really great mixture of entertainment and information.
|
Creating sitemaps and robots.txt files
Being unable to fall asleep again after waking up in the middle of the night is the perfect time to [insert your favorite activity here]. This time I decided to learn more about webcrawlers and search indexing. More specifically, Sitemaps and Robots.txt files, simple files that help make sure crawlers find what they're looking for — or don't find what they shouldn't. Here are a few sites that talk about this.

The robots.txt file is the greeting card for webcrawlers. When an indexing robot, say, from Google, Yahoo or some other search engine, visits your website — it starts by looking around for the robots.txt file. It's just a simple text file that can contain instructions for the robot — what to index, what not to index, etc. After looking at a few sites claiming that it never hurts to have one — I decided to find out the dos and don'ts of writing robots.txt files. Here is what I inserted into the robots.txt file I just created and dropped into my website's root directory:

User-agent: *
Disallow:
The User-agent part specifies which robot the instructions are for (Googlebot, for example); the asterisk (*) means 'anyone'. Disallow is where you specify a certain directory or file that the bots aren't allowed to index (if you don't want a certain file to be found). Example: Disallow: /mysecretstuff/. As I wanted everyone to index everything, I didn't put anything there. Simple enough, eh? Next time a robot comes around, it'll see the text file there and be happy — kind of like leaving cookies for Santa.
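If you want to sanity-check your rules before a real crawler shows up, Python's standard library happens to ship a robots.txt parser. A minimal sketch (the example.com URLs and the /mysecretstuff/ directory are just made-up illustrations):

```python
from urllib import robotparser

# Normally you'd point this at your live file with
# rp.set_url("http://yoursite.com/robots.txt") and rp.read();
# here we feed the rules in directly as lines.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /mysecretstuff/",
])

# The '*' record applies to any crawler, Googlebot included.
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))       # True
print(rp.can_fetch("Googlebot", "http://example.com/mysecretstuff/x"))  # False
```

Handy for checking that a Disallow line actually covers the paths you meant it to.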

Now, a Sitemap is an XML file containing a description of all the pages on your website. Again, this is for the robots — to help them index everything on your site and make sure they don't miss anything important. The file also contains some simple properties for each page, such as the update frequency (telling the webcrawlers how often they should visit), the relative importance of particular webpages and so on. If you own a site, many recommend a sitemap. Check out the links below to generate a map of your own site — then upload it to Google through the Webmaster tools.
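For illustration, here's roughly what a minimal sitemap looks like (the example.com URL and the date/frequency/priority values are made up — you'd fill in your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-12-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- one <url> entry per page on your site -->
</urlset>
```

Only &lt;loc&gt; is required per entry; the rest are hints that crawlers may or may not honor.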



|
Trademarks of Pseudoscience
I'm very reluctant and almost ashamed to even link to this article in The Register — it doesn't deserve your attention — but I'll write about it because of its relationship with my main field of research (Creativity / A.I.) and because it's a prime example of pseudoscience in action.

This guy is apparently responding to a recent interview with Stephen Hawking. In this interview, Hawking suggests that we must advance brain-computer interface technologies (connecting our brains directly to computers) so that the artificial brains of the future contribute to human intelligence rather than oppose it (interesting to hear that Hawking is thinking along these lines). The author of the rebuttal, Thomas Greene, in a borderline barbaric manner, blatantly claims that Hawking is an idiot and that human-level A.I. will never be possible.

Now, I've had a saying for many years, which is: If you believe in the Theory of Evolution, you believe that the brain is a machine. Machines can be replicated, hence you believe that human intelligence can be replicated in machines.

Mr. Greene believes in evolution. However — he introduces an interesting (or not) twist: He believes that humans encompass a certain quality that machines can't acquire. He calls it "irrational insight" — which we (humans) "mainly exhibit in religion, art and literature". What he's referring to is creativity, more or less — and that computers are too logical to replicate this feature. His actual point is irrelevant and I'm not going to waste my time answering his pseudoscientific arguments (of which he has more than a handful).

This dude exhibits and combines three commonplace intellectual fallacies, the trademarks of pseudoscience:

(1)
He assumes we know enough to know what we don't know, i.e., that human-level intelligence can only be brought about by natural evolution, and not by any other process.

(2)
He takes a concept that we don't fully understand yet (e.g. insight, creativity, emotions) and announces that it's impossible to replicate. Even though he doesn't really know how it works or what made it come about.

(3)
He draws concrete assumptions about scientific unknowns in our world, venturing instantly into religious territories.

There's also a fourth, more annoying than interesting fallacy: most of the concepts he mentions are very ambiguous and ill-defined, collectively obfuscating the pseudoscientific nature of his arguments. This makes the whole article very hard to counter-argue in a sensible manner, and regrettably will cause some poor souls to actually buy into it.

My last words are simply:
Beware pseudoscience.

|
Doomsday bunker in Germany to be opened to the public
Vineyard Picture
In 1960, the German government started building something gigantic at the pinot noir vineyards of Bad Neuenahr-Ahrweiler in Germany — yet, nothing is visible there but the vines. The structure is a $2.5 billion bunker: nearly 12 miles of tunnels, 936 bedrooms, 897 offices, and five small hospitals — all underground.

According to the NY Times:
"It was designed to enable 3,000 people to survive for 30 days after an attack. Among those, presumably, would be the chancellor, who could get here quickly from Bonn, the former West German capital, 18 miles away."

Now that is something that I'd like to visit, and maybe I will. In 2008, they intend to open up a 656-foot stretch of it to the general public — but I really regret never being able to see the other 11.875 miles they're not going to restore. I've always been fascinated by large underground constructions, and I for one wouldn't oppose spending a few days exploring the tunnels, hospitals and bedrooms that lie dormant beneath the vineyards of Germany.

Check out the pictures and more info in the NYTimes article today.


|
Little Man's Euphoria
A very productive weekend (in terms of leisure). Here's the sequel to last entry's video, I call it "Little Man's Euphoria".




|