Things have been really hectic over the past few weeks, but at least I've got time to breathe again. We presented the beta version of the Radio Station System project on Friday, and it went fairly well. Apart from one problem with reporting, everything ran smoothly and everybody seemed impressed. It never ends, though; now I'm trying to implement dynamically changeable look-and-feels for the portal component of the system.
That brings me to my biggest problem with C#: exception handling. Coming from a Java background, I've got so used to the compiler reminding me when I've forgotten to handle an exception somewhere. Now, don't get me wrong, I don't rely solely on the compiler to tell me when I've made a mistake, but it helps in some cases. Being fairly new to C#, I can't possibly know all the exceptions and where they could arise. So, inevitably, I miss quite a few and the application bombs out, usually at the most inopportune time, just as you're trying to show off some functionality that's taken months to build. Instead of getting some oohs and aahs, you've got a red face and you're mumbling something about the memory...
From what I've seen, the developers at Microsoft have tried to make things as easy as possible with good tools, frameworks, APIs and so on, whereas Java feels a little "lower-level". I'm sure the designers had some reason not to build C# with enforced (checked) exception handling, but I can't seem to find it. It just seems contradictory to the assistance everywhere else.
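Here's a minimal sketch of what I mean. The file-reading method and the settings.xml name are made up purely for illustration, not code from our project:

```csharp
using System;
using System.IO;

class Program
{
    // File.ReadAllText can throw FileNotFoundException, IOException,
    // UnauthorizedAccessException and more, but nothing forces the
    // caller to handle (or even know about) any of them.
    static string ReadConfig(string path)
    {
        return File.ReadAllText(path);
    }

    static void Main()
    {
        // Compiles without a single warning. The equivalent Java method,
        // declared "throws IOException", wouldn't compile until the caller
        // either caught the exception or declared it too.
        Console.WriteLine(ReadConfig("settings.xml"));

        // What I have to remember to write myself:
        try
        {
            Console.WriteLine(ReadConfig("settings.xml"));
        }
        catch (IOException ex)
        {
            Console.WriteLine("Could not read config: " + ex.Message);
        }
    }
}
```

The first call is exactly the kind of thing that slips through until demo day; the second is what the Java compiler would have nagged me into writing up front.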
"A 'Frankenrobot' with a biological brain" - They've built a robot that is controlled using a living brain. Amazing! Link
In our quest for what is maybe the ultimate goal in computer science, strong AI, they've wired a living brain to hardware sensors and actuators. I don't know if this will be the start of the biological supercomputer, as there are many ethical issues involved, but maybe it'll shed some light on the functioning of our own brains. It may also lead to some interesting discoveries or developments in the field of artificial neural networks. This isn't the first time something like this has been done, but it's possibly the most advanced attempt yet? Very interesting...
"machines will be capable, within twenty years, of doing any work a man can do"...
Herbert Simon, 1965
So... I'd be very careful about predicting the future within the computer science realm.
"London - Spending on online ads overtook advertising on mainstream TV in Britain last year, growing 40% to £2.8bn and accounting for 19% of all advertising, UK regulator Ofcom said."
"Online advertising spending was dominated by paid-for search, in which sponsored links appear as internet search results. Paid-for search accounted for £1.6bn pounds, with the rest split equally between display and classified ads. "
"In its annual report on Britain's £51bn communications industry, the watchdog found that Britons spent four times as much time on computers, or 24 minutes a day, and twice as much time on cellphones in 2007 as in 2002. "
news24
There are so many advantages to advertising on the net versus TV mass marketing that I'm really not surprised this has finally happened. Not only can advertising be targeted better (think simple Google Ads) but, maybe more importantly, you can measure the effectiveness of your campaign to some degree. With some of the new online tools, you can objectively measure the impact of an advert. Calculating ROI on advertising spend is difficult, but these tools are at least a step in the right direction.
Google has some awesome tools available, free for use. Firstly, as most people know, there's Gmail: 6GB of space and growing, POP and SMTP access, and so on. Most importantly for me, unobtrusive advertising. Yahoo, on the other hand, has tons of irrelevant and annoying advertising. So: Google 1, Yahoo 0. Then there's iGoogle, a nice portal which can house all your bookmarks, email, calendar (another nice product), RSS news feeds and more.
Something I recently discovered while trying to submit my blog to Google's index was Google Webmaster Tools. You can host your websites with Google Pages and use Webmaster Tools to trace Google's indexing of your pages. It works by having you embed a randomly generated value in a meta tag on your site's default page; this acts as an authentication mechanism verifying you as the owner of the site. Once verified, you can view a number of stats collected while Google crawls your site, such as top search queries, what Googlebot sees, indexing stats, subscriber stats and so on. All in all, there's a fair bit of useful information there. Yahoo has a similar process for submitting pages, except that where Google's verification is instantaneous, Yahoo's takes 48 hours and doesn't seem to give you as much info. (My page hasn't been indexed by Yahoo yet, so I can't say for sure; there's no data to compare as yet.)
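For anyone curious, the verification tag you paste into your page's head section looks something along these lines. The attribute name and token below are placeholders only; Google generates the real values for you when you add the site:

```html
<head>
  <title>My Blog</title>
  <!-- Placeholder only: Google generates the actual name/content values
       for your site when you add it in Webmaster Tools. -->
  <meta name="verify-v1" content="AbC123ExampleTokenNotReal=" />
</head>
```

Once Googlebot sees that tag on your default page, the site shows up as verified in your Webmaster Tools account and the stats start appearing.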
But the really good one is Google Analytics. It's a tool to "help you learn even more about where your visitors come from and how they interact with your site. The new Google Analytics makes it easy to improve your results online. Write better ads, strengthen your marketing initiatives, and create higher-converting websites. Google Analytics is free to all advertisers, publishers, and site owners." (The first 5 million page hit statistics are recorded for free; I think you have to pay after that.) Analytics works by embedding a bit of JavaScript in each page (or in an include, for dynamic pages) which downloads another script on page load; that script sends data back to Google, where it's stored. There are tons of reports to view: browser usage, page visits, unique visitors, benchmarking, average time on site, traffic sources and the geographical distribution of your visitors. Most of the data is presented as graphs, charts, maps or tables. This seems like a really good tool for maximizing your site traffic and, best of all, it's free.
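The tracking snippet Google asks you to paste into each page (just before the closing body tag, if I remember correctly) looks roughly like the following. I'm quoting it from memory, so treat the details as approximate, and the UA number is a placeholder for your own account ID:

```html
<script type="text/javascript">
  // Pull Google's ga.js tracking script over HTTP or HTTPS as appropriate.
  var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
  document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
  // "UA-XXXXXX-X" is a placeholder; Analytics issues the real ID per account.
  var pageTracker = _gat._getTracker("UA-XXXXXX-X");
  pageTracker._trackPageview();
</script>
```

For a dynamic site you'd obviously drop this into a shared footer include rather than pasting it into every page by hand.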