Innovation »

  • Self-driving cars and the metropolis of the future

    May 28, 2012 @ 12:34 pm | by Davin O'Dwyer

    Passing a driving test is a memorable rite of passage for most people, but when Google received its driving permit in Nevada early in May, it was a turning point that might have considerably greater implications for the future of our species – and not just in terms of transport.

    Google, of course, has been developing autonomous self-driving cars for a few years now, customising a small fleet of vehicles with sensors, cameras and all sorts of computer modelling equipment. There can be no doubt at this stage that self-driving cars are the future of the automobile, and the only question left is the timescale of their mainstream adoption – think decades rather than years.

    But the effects of autonomously driven cars go beyond being able to text behind the wheel, as some thought-provoking discussions have lately highlighted – the very fabric of our urban spaces will gradually be transformed with the rise of self-driving vehicles.

    Slate’s Matt Yglesias made some interesting points about how the whole model of car ownership is likely to change, for a start – they will be more akin to automated taxis than robotic possessions. And that will have massive implications for the provision of car parking spaces:

    “By contrast, right now every metropolitan area in the United States contains many, many more parking spaces than automobiles. When you’re at work, the space allocated for your vehicle at home sits there empty. When you’re at home, the space allocated for your vehicle at the office sits empty. Malls build parking to accommodate demand during peak hours, and the spaces mostly sit empty off-peak. But if the cars could drive around without a human pilot, there’d be no need for such lavish supplies of vehicle storage…After exploding for about 60 years, the torrent of parking construction is going to halt very suddenly and then start shifting into reverse.”

    And the decline of parking will be just the beginning of the transformative effects self-driving cars might have on urban design. Yglesias’ comments prompted Timothy B Lee at Forbes to speculate on other potential effects: greater housing density around suburban transit stations, higher road density as most self-driving cars won’t need to be as big as current automobiles, and the complete reinvention of public transport, with self-driving vans replacing buses and potentially light rail services. However, Lee suggests that the impact will be unevenly distributed, with different effects on small and large cities:

    “In smaller metro areas, self-driving cars will likely make recently-built light rail systems look even more like white elephants, as the falling cost of taxi service and the reduction in congestion causes many rail customers to switch to them…On the other hand, in larger metro areas the emergence of affordable taxi service may actually increase subway ridership, as more suburban residents take a taxi to their local subway stop and ride to work in the central business district. Indeed, the greater efficiency of self-driving transportation has the potential to dramatically increase the size and density of our largest cities.”

    Given the low-density sprawl that typifies Dublin and, to a degree, other Irish cities, and the high reliance on individual vehicles among commuters, it’s an interesting exercise to speculate on what effect self-driving cars will have here. The size of Dublin’s commuter belt and the concentration of so many vehicles on our roads at rush hour means that self-driving cars are unlikely to efficiently act as a fleet of automated taxis, but the glaring inadequacies of our transport system will undoubtedly be improved by the advance of this technology. How it will change the face of our urban spaces is, at the very least, fun to think about.

  • Spectacles vs Watches – Battle Commences

    May 8, 2012 @ 2:35 pm | by Davin O'Dwyer

    In yesterday’s Innovation Talk column, I admitted that I’m a bit sceptical about the potential for smart glasses, despite the rather cool Google Project Glass video – I’m just not convinced that they represent the future of mobile computing.

    I am convinced of one thing, however: they are not going to act as a replacement for a smartphone, not immediately and, I’d wager, not ever. People are not going to wear Google glasses all the time – as a glasses wearer, trust me, they just won’t – so users are going to carry phones with them for when they’re not wearing the glasses. And if you’re carrying a phone for those occasions you don’t want to wear the glasses – when you’re not in the mood for the dorky look, say – why bother with the glasses? The phone is going to be far more useful than the glasses for a whole range of apps and tasks, so the specs are likely going to be an auxiliary device, and thus remain a niche gimmick.

    (Plus, as tech writer Tom Davenport put it to me last week, imagine how difficult it will be to convince people that having a cellular antenna strapped to your head for 12 or so hours every day poses no health risk. It’s a Daily Mail scare story waiting to happen.)

    I thought a similar fate might doom smart watches – your phone can do more, so why bother with another gadget – until I read MG Siegler’s take on the keenly anticipated Pebble watch, which interacts with iPhones and Android phones:

    “Will Apple make a wrist device? I don’t know. But they should at the very least be thinking about it.
    All I know is that at least 50 times a day I reach in my pocket to see why my phone just buzzed. A new email? A DM? An iMessage? Some sports score alert? Instagram? Path? Facebook? Foursquare?
    I reach into my pocket, pull out my iPhone, turn on the screen, see the notification, then turn off the screen, then put the phone back in my pocket.
    Imagine if I could just look at my wrist?”

    This makes a lot more sense – the ease of low-friction smartphone monitoring without the, you know, stupid glasses. As an auxiliary device, the wristwatch has a lot of advantages over a geeked-out pair of glasses, in that it’s always viewable, without necessarily being intrusive.

    Admittedly, the notion of needing an auxiliary device to augment a smartphone might seem the height of nerdiness to most people, but there are plenty of consumers out there lusting after the Pebble – judging by this Guardian report from Charles Arthur, Pebble is looking like a monster hit, even before its Kickstarter funding drive finishes up in a few weeks.

    Clearly, mobile computing is about to get a lot more personal – wearable computers are on their way. But when it comes to ubiquitous, on-body computing devices, I reckon the e-ink wristwatch is going to trump the smart-spectacles. Of course, just as inevitably, the incidence of people being distracted by notifications popping up on their wrist device is going to become the new scourge of real-world human interaction. How ever did Dick Tracy overcome that problem?

  • The mysterious invention of email

    February 27, 2012 @ 1:16 pm | by Davin O'Dwyer
    Who invented email? You might think it a pretty basic question, but a recent fiasco suggests that it’s a lot more complicated than one might think. It all started with the Smithsonian museum’s acquisition of “tapes, documentation, copyrights, and over 50,000 lines of code” from a New Jersey programmer called VA Shiva Ayyadurai, relating to an early messaging system he designed. Coverage of the acquisition, such as this Washington Post article, unambiguously called Ayyadurai the inventor of email.
    But it seems that Ayyadurai’s claim is, well, contentious, to say the least. As soon as news of the Smithsonian’s purchase [See Thomas Haigh's comment below, it was a donation rather than a purchase] became public, a chorus of internet historians began to object, insisting that Ayyadurai didn’t, in fact, invent email.
    Ayyadurai’s claim rests on the fact that he developed a messaging program back in 1978, when he was only 14. By 1982, he had managed to copyright the word “email”. That’s not the same as inventing the technology, needless to say.
    The truth is that it’s very difficult to determine who invented email – it’s just words being sent over a network after all. There was some consensus that the first such communication took place in the early days of Arpanet, where a programmer by the name of Ray Tomlinson developed a messaging protocol in 1971 – he describes that process on his suitably retro-looking website.
    The brouhaha illustrates how our assumptions about the process of invention are pretty outdated – the notion of sole inventors toiling away on breakthroughs just isn’t applicable any more. With many technologies, there can be nothing so cut and dried as a single “author” – it’s an incremental process of improvement, and Tomlinson, for instance, spreads the credit. Tim Berners-Lee is another prominent example, and even he doesn’t claim that he invented the world wide web singlehandedly in a vacuum. We like the narrative purity of thinking that certain people are responsible for certain things, but it’s more than a little simplistic to think that’s how the process of invention happens.
    In a statement clarifying the motives behind the acquisition, the Smithsonian acknowledged that reality: “Many innovations are conceived independently in different settings.” But there can only be so much independence in these areas – standing on the shoulders of giants, or at least other programmers, is an inherently dependent process. Perhaps the answer to the question “Who invented email?” isn’t, ultimately, a person, but a process.
  • Clearing the way to the future

    February 20, 2012 @ 9:35 am | by Davin O'Dwyer

    Clear app, by Realmac Software, Impending Studios and Milen Dzhumerov

    The tech blogosphere has been buzzing all week about Clear, an innovative iPhone to-do app from Realmac Software, Impending Studios and London-based developer Milen Dzhumerov that was released last week to great acclaim (available here for a limited-time introductory price of 79c). It’s not an exaggeration to suggest it is the most-hyped app of the year, as far as these things go, with some great reflections on Clear’s gesture-laden user interface floating around.
    The headline on Matthew Panzarino’s piece on The Next Web, How a simple list app called Clear may change how we use our devices forever, captured the mood – Clear is being widely hailed as a landmark in user-interface design, marking a comprehensive break from the conventions of decades of computer use in favour of something far more, well, touchy-feely.
    Instead of relying on the usual array of buttons, arrows and calendars that characterise the competition, Clear only offers simple lists, navigated by touchscreen gestures such as swipes and pinches, with varying colours and sounds offering feedback. It does need to be used, or at least seen, to be appreciated, and this video captures some of Clear’s magic.
    The interface is minimal in the extreme – the user is faced with nothing but the items to do. The interaction comes not through buttons and menus and visible clues, but purely through touching and manipulating the items. Slide to the side to mark as done or delete, pinch to move up a layer, slide down to create a new item, and so on. The colour-coded prioritisation is a visually intuitive touch, and the sounds are nicely game-like.
    But as John Pavlus points out on FastCo Design, Clear tramples all over established usability principles.
    “Interaction-design greybeards like Donald Norman and Jakob Nielsen would say Clear’s gestural UI breaks two fundamental rules: ‘Visibility (also called perceived affordances or signifiers)’ and ‘Discoverability: All operations can be discovered by systematic exploration of menus.’”
    Of course, as designer Francisco Inchauste points out, the usability conventions we’re just getting used to and settling on – pinching and swiping and sliding and the like – will be as natural to someone who grows up using touchscreen devices as pointing and clicking with a mouse feels to us. Already, there’s plenty of amusing evidence that children used to playing with iPads think that magazines are broken, and I’ve lost count of the number of times I’ve tried to select words on my laptop screen by tapping them.
    In the words of Aynne Valencia and Alfred Lui of international design consultancy Fjord:
    “User interface designers are beginning to realize that there is no longer the need to hang on to representations of real life objects and drag them into the digital space. Digital is something else. It’s magical. It affords the user magical powers. It is no longer the user, a mouse, and a complicated ballet of hand eye coordination…Clear’s focus on gestural UI bestows this sense of magic by escaping the traditional paradigm of check boxes and text inputs that normally exist with digital to-do lists.”
    Up to now, my to-do app of choice was a pretty feature-rich app called 2Do, which syncs across all my devices and has no end of customisation and endless settings and reminders and location awareness and what have you. But there was always an uncomfortable friction while entering stuff that needed to get done – whereas Clear is akin to writing down a shopping list, 2Do sometimes veered too close to filling out a health insurance form. Where Clear is lithe and responsive, 2Do often felt cumbersome, and 2Do is by no means the most convoluted to-do app around – the task manager of choice in geek circles is the fearsome Omnifocus, the Adobe Photoshop of “Getting Things Done”, with nearly as many tools and settings.
    Obviously, Clear does a lot less than 2Do – it doesn’t populate my calendar with deadlines, alert me with reminders or sync with my MacBook. But despite that reduced feature set, I’m already using Clear far more than 2Do, because it’s far more delightful and efficient. Its focus and design, and its focus on design, ultimately makes it a more useful app.
    It’s not without flaws, and I’m not sure how such fluid, intuitive user interface paradigms can scale to more complex apps, but every time I open it to add to a list or mark another item as done, it certainly feels like the future.
  • Launching the Publishing Renaissance

    January 27, 2012 @ 4:10 pm | by Davin O'Dwyer

    My interview with the US economist and renowned Marginal Revolution blogger Alex Tabarrok is in today’s Innovation magazine, and can be read here. Tabarrok’s new ebook is called Launching the Innovation Renaissance, and it’s a sharp, brisk read that tackles some of the urgent issues facing the US in terms of encouraging and sustaining an innovation economy, from patent law reform to immigration policies to overhauling education at both second and third level.
    Tabarrok discusses these issues in the interview, but the book itself is a good example of the disruptive innovation going on in the publishing industry. Released as an ebook by TED books, the publishing arm of the famed “ideas conference”, it points to a trend in publishing that I think is going to become increasingly pronounced as more and more readers go digital – non-fiction books in particular are going to become shorter and will be produced on much tighter schedules. The increasing popularity of Amazon Singles is another sign of how the long-form essay and the short-form book are meeting in the middle in digital form – in many ways, it’s the return of the sort of pamphlet that for so long allowed for the quick exchange and formation of ideas.
    Tabarrok says that his experience in the blogosphere shaped his new book in a fundamental way. “The first thing you learn when you’re blogging is that people are one click away from leaving you,” he says. “So you’ve got to get to the point, you can’t waste people’s time, you’ve got to give them some value for their limited attention span. And I’ve tried to write the book in that light.”
    Launching is certainly brisk, covering a lot of ground very concisely, and it illustrates how policy-oriented material aimed at a mainstream audience can benefit from the sort of focus the format encourages – a previous example was The Great Stagnation by Tabarrok’s fellow Marginal Revolutionary Tyler Cowen (in some respects, Launching can be seen as a companion piece to Cowen’s ebook).
    The rise of the short-form ebook also serves to differentiate the material from academic-oriented writing – a number of books in the social sciences run the risk of attempting to cater to both academic and mainstream audiences, ultimately pleasing neither.
    “It’s difficult in academic work, there is a different set of virtues that are appreciated, sometimes it can be a little bit tricky going back and forth,” says Tabarrok. “Writing on the blog, you want to get attention and make strong claims. In academic work, that often doesn’t pay, so sometimes it’s a little bit difficult going back and forth to navigate these differences.”
    In any case, expect this form of innovation to disrupt traditional publishing in a big way over the next few years.
  • Good design, bad design and thermostats

    November 25, 2011 @ 3:31 pm | by Davin O'Dwyer

    Nest smart thermostat

    We tend to know good design when we see it, but we seem to be less astute when it comes to rejecting bad design. How else can we explain the almost universally atrocious usability of the domestic thermostat? How many hours have been lost fiddling with teeny dials and clicking inscrutable switches? And how much energy has been wasted due to incorrect settings? It might seem like a mundane piece of the average household scenery, but if, say, bath taps were as badly designed, we would only have a vague idea how much water was going to come out, never mind how hot it would be.

    Resolving these critical design flaws was the unlikely project taken on by Tony Fadell, one of the originators of the iPod and the former senior vice president of Apple’s iPod division. Fadell explained that “Thermostats looked like PCs from the 90s: square, beige, nothing innovative, and very expensive.” In other words, ripe for reinvention.

    In an interview with Steven Levy in Wired, he admitted to the scepticism people expressed when he told them his next project was going to be the reinvention of the humble thermostat. I doubt they were as sceptical when the result was finally unveiled a few weeks ago – the Nest thermostat instantly generated huge amounts of attention and, indeed, lust from bloggers and tech writers. As TechCrunch’s MG Siegler put it, “It makes me want to buy a home just so I use it.” Its first production run sold out within 72 hours.

    The Nest is certainly the product of excellent design, and not just because it is so aesthetically pleasing, especially in comparison to its rivals, though the futuristic dial-design and pleasing typography and graphics certainly contribute to the appeal. It might look good, but Fadell also knew it needed to be intuitive and easy to use. The Nest’s good design extends to its five sensors, measuring temperature, humidity, light and activity, detecting when people are home and altering the temperature accordingly. Its good design can be seen in the way it learns from previous adjustments, predicting when to raise and lower the temperature, self-adjusting when necessary. Its good design includes the inbuilt Wifi that allows it to be controlled by smartphone apps, so users can adjust their homes’ temperatures remotely, or remind it that the house will be empty for a few days. The prospect of a “smart” domestic appliance, doing the thinking for you, learning and remembering and applying its knowledge, has just become real.

    Above all, Fadell was inspired by the goal of reducing our energy waste – the real cost of bad design wasn’t just an unsightly gadget on the wall, it was all the oil and electricity that was frittered away because people couldn’t or wouldn’t properly engage with the ugly gadget on the wall. As a piece of greentech, it demonstrates that not all the innovation needs to be on the energy-production side – users need to play a part too. And as an example of how well-designed technology can improve our lives, Nest is a small but telling pointer to the future.

  • The Elevator Pitch

    October 21, 2011 @ 11:43 am | by Michael McAleer

    So what are you up to these days? It’s a simple question, asked of grandchildren the world over. After years of practice we should be well able to explain ourselves in a relatively short and snappy way. Or so you’d think.

    The problem with being a hack these days is that you spend so much time bound to a desk or sitting through supposedly insightful presentations that are big on PowerPoint but small on detail. Quick snappy opening lines are often followed up with rather ropey explanations of the detail.

    Yet the experience underlines the power of good presentation skills, and how important the art of clarity – free of the pitfalls of PR spoofing – is for any future innovator.

    There’s no escaping the need to stand up and be counted these days. Whether it’s to potential investors or customers – or even a bunch of hacks – there’s an often ignored talent in comprehensively and clearly presenting your ideas. Get it right and the good times may well roll. Get it wrong and a world of bitterness and “what-if” awaits.

    It’s the talent that made Steve Jobs the doyen of the media for his showmanship, but also undoubtedly lured investors to his projects. Meanwhile, the bars of the world are filled with innovators who could have changed the world if only they could better explain the benefits of their big ideas.

    I had the pleasure of judging one of the qualifier rounds for the Thesis in Three competition this week. Organised by Clarity, the SFI-funded centre for sensory web technologies, it’s a prelude to a final event in the Mansion House on November 9th, where finalists from the nation’s third-level research centres will be asked to explain what it is they are up to in three minutes, with the usual presentation crutch of three PowerPoint slides for support.

    For some of us three minutes is an awfully long time to explain that we spend our days typing and staying awake at presentations, but if you’re currently spending your time manipulating thermo-responsive gels or debugging and monitoring wireless sensor networks, then it’s not that easy to boil it down so that people like me can understand. And just for the added challenge, try presenting it in front of your peers in English, when that’s not your native tongue.

    I’m told that the elevator pitch has become a regular feature of research PhDs, much to the chagrin of many of the students, who see their life’s work as more about breakthroughs and solving problems rather than slick presentations. Yet if you want your research to be recognised and appreciated, there’s no reward for hiding it away behind complex jargon or waiting for others to put in the legwork to discover just how great your achievements are.

    The presentations at this week’s event concentrated on the realm of sensors, with several researchers focussing on the environmental area, reflecting perhaps the strong interest from the commercial sector in better sensors to monitor energy usage, pollution, and the like. Similarly there were several presentations on improving internet search and recommendation tools.

    The winners on the night were Steven Bourke, Jogile Kuklyte and Ken Conroy. All three will go on to present at the Mansion House event. They should do well: their projects are interesting and innovative, and they explain them with clarity, identifying the problems at present and how their work might resolve them.

    There was an impressive array of talent on display, a demonstration of the innovation in action in our third-level institutions, much of which goes unnoticed by the general public. It’s important for several reasons that this starts to change. First, from an economic point of view, taxpayers need to be aware that their money is being invested in useful and cutting-edge developments. Second, the researchers need to be able to get their message across succinctly and capture the attention of potential investors. Like it or not, at some point a PhD with an ambition to see the research deliver to its potential is going to have to seek support. That means joining in the pitch battles to catch the attention of overrun venture capitalists or angel investors.

    It might seem a distracting sideline to gather in a small room above a Dublin pub and seek to explain what you’re up to in three minutes to the likes of me, but practice makes perfect and the three who go forward have the talent to take their ideas beyond the laboratory and into the real world. I believe one or two of them already have. Thesis in Three gives these PhD students great practice for the future, and that’s partly what education is all about.

  • Farewell to Steve

    October 6, 2011 @ 3:30 pm | by Davin O'Dwyer

    Late last night, I fired up my iPad for a final glance at my RSS feeds before hitting the sack – excellent, a new post from John Gruber at Daring Fireball. Gruber is the most perceptive Apple writer around, and he’s always worth reading, no matter how brief his posts.
    And this one was very brief. “Damn,” was all it said. The headline was only slightly longer, but it was huge – “Steve Jobs Dies”.
    I’ve admired the man and his achievements for so long, and I’ve been using Apple products since I was 10 or 12, since before Jobs’s prodigal return even, when there was no Ive factor to admire. Back then, being an Apple fan was a minority interest, a tribal identification, and one that frequently inspired scorn in others, for reasons I could never quite fathom. To see the last decade of success, where products that manifest a uniquely Jobsian devotion to good taste and a quest for perfection have become so ubiquitous and beloved, is to have that tribal identity vindicated. That loyalty proven right. The huge outpouring of emotion at the news of Jobs’ passing hints that it’s a pretty common feeling around the world, and yet it’s a bond that feels intensely personal, and one I doubt we’ll ever witness again in our lifetimes.

    There have been some beautiful tributes online and around the world; here are some of the ones that caught my eye.

    Apple’s home page is touchingly minimalist and classy and white, as we’d expect.
    Wired have gone for an all-black equivalent, and it’s touching.
    BoingBoing, well, it’s nostalgic and funny, so typical of them.
    The developers at Panic have done a nice little tribute.
    Some fond memories here from tech writers Brian Lam and Walt Mossberg, testaments to the guy’s inherent decency, despite that fierce, tyrannical reputation.
    The Guardian have a gallery of photos of tributes from around the world here.

    I’ve written about the great man for the paper a few times – here’s my article from the time of his resignation, and another from when he announced his medical leave back in January.

    And Gruber has composed a touching farewell, a beautifully observed vignette. Read it, it’s what you’ll end up remembering about how Jobs faced the end of his life.

    It was news I knew I didn’t want to hear, that I knew would be upsetting. But I find it comforting, somehow, that I found out from John Gruber on my iPad – a hint of Steve’s influence on my life, my interests, my priorities, my sense of self. Now time to go about staying young and staying foolish.

  • The Netflix Way

    September 23, 2011 @ 3:02 pm | by Davin O'Dwyer

    The announcement last Sunday that Netflix was to separate the DVD rental-by-post business that made it famous from its movie downloading business generated all sorts of criticism in the US tech press and blogosphere, and not just because the new name was the palpably lame moniker Qwikster (isn’t that a chocolatey powder drink?).
    It’s not hard to see why everyone got so riled – chief exec Reed Hastings published a lengthy blog post trying to explain the thinking behind July’s decision to charge for Netflix’s movie downloading and DVD rental businesses separately, rather than at the attractively low, bundled price it had previously charged. But instead of explaining that, Hastings actually introduced further inconveniences to the service: “A negative of the renaming and separation is that the Qwikster.com and Netflix.com websites will not be integrated,” he wrote. For Netflix customers, not only were they being charged more, but the service just got worse.
    Now this is of merely academic interest to Irish film-lovers – Netflix is North American only, unfortunately – but what is of interest is Hastings’ rationale for focusing on streaming and separating the two sides:
    “For the past five years, my greatest fear at Netflix has been that we wouldn’t make the leap from success in DVDs to success in streaming. Most companies that are great at something – like AOL dialup or Borders bookstores – do not become great at new things people want (streaming for us) because they are afraid to hurt their initial business. Eventually these companies realize their error of not focusing enough on the new thing, and then the company fights desperately and hopelessly to recover. Companies rarely die from moving too fast, and they frequently die from moving too slowly.”
    Now this is just common sense – DVD by mail might be a lucrative business now, but it’s obviously a dead dodo walking, with a “best before date” looming in the next five to 10 years. And there are plenty of pundits out there suggesting alternative motivations behind the split (studios demanding per-subscriber payments, possible takeover by Amazon, yada yada yada).
    But Hastings is articulating a critical business truism that is so rarely heeded – changing from one revenue stream to another revenue stream that will be more lucrative down the line often means harming the current cash cow. It’s a brave move, and such bravery is exceedingly rare in companies that are actually profitable – that’s why many operations only innovate and reinvent themselves when their backs are against the wall and their current revenue stream faces collapse, which is obviously a bad time to be reinventing yourself.
    There are plenty of examples of this phenomenon – Microsoft’s reliance on Windows and Office licensing has long acted as a major hindrance on its ability to innovate. Only now that Apple is reinventing personal computing with the iPad is Microsoft showing a willingness to change its ways with Windows 8 and Metro.
    And it’s not just the technology business that’s prone to this quandary – newspapers are in a similar position, trying to shore up their legacy paper operations while building digital platforms to carry them into the future. Needless to say, the past and the future aren’t comfortable bedfellows.
    So well done to Hastings for having the bravery to make this move, but there are more than a few people who think he might get to test his theory that “Companies rarely die from moving too fast”.
  • An Irish top ten university

    September 16, 2011 @ 12:32 pm | by Chris Carpenter

    The QS World University rankings for 2011 were published last week. There was considerable media comment, including by The Irish Times, noting that most Irish universities were continuing their recent downward trend: TCD has slipped 13 places to 65; UCD is down 20 places to 134; NUI Galway is down 66 places to 298. However, UCC is marginally up, rising 3 places to 181, and RTE also reported that UCC was the first Irish university to receive five gold QS stars! Nevertheless, my own reading of the QS results shows that the University of Limerick also has five such stars. The QS star rating is, in my view, confusing: it is a new methodology “using more comprehensive indicators to those used in the rankings”. However, the QS star rating is an “opt-in” mechanism in which universities volunteer to participate: for example, the universities ranked 4th, 5th, 6th and 7th in the world (Yale, the University of Oxford, Imperial College London and UCL) have no QS star rating :-)

    Also last week, many in Waterford made a renewed call to upgrade WIT to full university status. Our Government is reputedly divided on this particular issue, with some thought to be in favour and others concerned about further dilution of the already scant national education budget. As I understand it, some other institutes (amongst them DIT and CIT) are also seeking university status.

    Ireland already has seven universities (DCU; NUIs Galway and Maynooth; TCD; University Colleges Cork and Dublin; and the University of Limerick). Of these seven, four – i.e. 57% – are in the top 300 universities in the world (according to the QS rankings)! By contrast, the UK has 115 universities, with 30 in the top 300 – i.e. 26%. The United States has 4,084 universities offering four-year undergraduate courses; of these, only 69 are in the world’s top 300 – i.e. less than 2%!

    When challenged over the slippage of most of the Irish universities down this year’s rankings, those involved have argued for an increase in funding, including the re-introduction of fees for those who can afford to pay. It thus seemed interesting timing that, also last week, UCD included an eight-page special report in the Irish Times on the new UCD Science Centre. According to the report, this represents a €300M investment in total across a number of phases. The first phase opened last week at a cost of €60M, the vast bulk of which apparently came from the Irish taxpayer. The second phase will cost €110M, with €80M coming from the Irish taxpayer and a further €30M from private sector fundraising.

    In many universities in the United States, it is quite common for freshman undergraduates to be continually reminded that they are able to be in their university, with its buildings and facilities and faculty, largely due to the generosity of those students who have gone before them and who now, as alumni, have given back so that a new generation may also benefit. The message to each new student is clear: we expect you to donate back to your alma mater during your career. In my view, sadly, this message is rarely as enthusiastically delivered to freshmen in the Irish university system. However, even if it were, it would probably not in fact be true: students in Irish universities are in their university, with its buildings and facilities and faculty, largely due to the generosity of the Irish taxpayer and Irish society at large. The implication should be clear: our Irish nation has a right to expect that each student will contribute back to Irish society during their career. Furthermore, the Irish universities are owned by the Irish taxpayers, and should thus be as widely accessible as possible to citizens who have the appropriate educational attainment.

    The strength of the national university system and the quality of the graduate pool underpin our economic growth. The better the education of graduates, the more likely it is that companies will undertake ventures which add high value, and the less internationally mobile these ventures will be (one of the most disturbing aspects of last week’s announcement about the closure of the Irish operations of Talk Talk is the apparent ease with which this venture can be relocated to a lower-cost location). Equally, our graduates should be mentored to be entrepreneurial, articulate and creative. Also last week, Professor Brian MacCraith, President of DCU, announced his “Generation 21” initiative to ensure graduates are employable no matter how uncertain the future: each student will develop an “e-CV” as they progress through DCU, auditing their learning and character development.

    Cultivating confidence and eloquence in students is one of the major benefits of the Science Gallery (of which I am founding Chairman) at TCD. The Gallery brings together scientific discovery and artistic expressiveness, reminiscent of the great natural philosophers and artists, a breadth that science’s narrow specialisation has largely lost today. Our student mediator programme trains students to be articulate and expressive, able to explain themselves and technology to the public visiting the Gallery’s exhibitions. Employers have been noticing the quality of students who have been trained as mediators at the Gallery.

    Whilst employers do consider the QS rankings and character, they are increasingly using other techniques to evaluate graduates, as I’ll explain below. The QS ranking system is rather intended for international students, providing a tool to help them choose a top university appropriate to their budget. The business of international students can be a significant addition to a university’s income. This business used to be largely one-way traffic of Asian students heading to western universities, but today the picture has changed considerably. While there are 440,000 Chinese students abroad, China is targeting 500,000 international students to come to its own universities. There are now more “international” students taking degree courses from UK universities in their own home countries than there are international students actually coming to the UK: some 340,000 students attend branch campuses of UK universities overseas. Some surprising new international university locations have emerged, such as Germany. For example, the Ruprecht-Karls-Universität Heidelberg and the Technische Universität München both rank higher than any Irish university in the QS rankings: both offer undergraduate and postgraduate study through English, and at very low cost to EU nationals.

    A further trend is towards distance learning and online education. This term, for the first time ever, anyone can register and attend (live over the internet) certain undergraduate degree courses offered by Stanford University’s Department of Computer Science. Online students are expected to submit assignments and sit online examinations, and will receive a certificate of attainment at the end. The courses were announced just a few weeks ago, and so far over 100,000 students have registered worldwide.

    One may question the value of online and distance learning, and perhaps even more so of the large portfolio of educational podcasts available online, compared to traditional university teaching. However, given the internationalisation of global business and education, it is becoming increasingly difficult for employers to accurately judge the calibre of graduate candidates from so many different universities. One obvious way to overcome this is to run examinations as part of the interview process: regardless of degree status and university ranking, or even whether a candidate actually attended a physical campus or instead studied online, a graduate can be put directly to the test. Many professional bodies – in engineering, law, accounting and so on – already insist on their own independent examination systems: the move to online study may accelerate such assessment mechanisms for many employers.

    Where does all this leave the Irish third (undergraduate) and fourth (postgraduate) level education system? Whilst 57% of Irish universities are in the top 300 QS-ranked worldwide, none are in the top 10 or even the top 50. Online study (such as the new Stanford model), increasingly available from the top 10 or even the top 50, taken together with independent assessment by employers and professional bodies, threatens established lower-tier universities. The emergence of English as the teaching language of choice in top universities in non-English-speaking countries puts further pressure on the lower tiers.

    In my view, the Irish national system must make every effort to have one Irish university amongst the top 10 in the world. Given that this top-10 university is likely to be substantially funded by the Irish taxpayer, it should be widely available to all appropriately qualified students in Ireland (together with international students). Thus, the emergence of online access is not only a threat, but an opportunity. If we had a single Irish university amongst the top 10 in the world, with the very best faculty and teaching available, then we could ensure its education is available nationwide through a combination of online access, augmented by “branch” campuses as needed in further locations across Ireland. I believe there is a compelling argument, and urgency, for focusing our taxpayers’ resources on just a single Irish university, with online access and a branch network, with the specific aim of nationwide access to an Irish world top-10 university.
