How can we navigate our path in an interrupt-driven world?

What does the smartphone era, where focus does not seem to be the goal and the quantum of information being assimilated has gone down to “bite sizes”, augur for ‘learning’ and ‘working’ environments?


Wednesday May 08, 2019, 9 min read

Back in the nineties, as I was meddling with x86 machines running MS-DOS, I was fascinated by Terminate-and-Stay-Resident (TSR) programs. These programs would keep running in the background even while you ran your “main” programs — I say this well aware that it sounds utterly unremarkable in today’s world of multitasking systems.


Well, the catch is that this was on MS-DOS, a single-process, flat-memory operating system — yes, seemingly the worst possible design for an operating system — but on the positive side, it made me fall in love with computers back then, since you had a barebones machine to hack, learn from, and build stuff on.




The Interruptible Computer Processor


I then learnt that these TSR programs, once run, would rewrite the interrupt vector table of the BIOS/DOS system to point to their own code. The interrupt service routines were pieces of code that would get executed when some event happened — a press of a key on the keyboard, the click of a mouse, or the processor timer firing at periodic intervals. Whenever such an event occurred, the processor would be interrupted: it would stop what it was doing, execute the corresponding interrupt service routine, and then resume its original task.


A TSR program could hook itself to a timer interrupt and have its code executed periodically, so it would seem as if these programs were co-executing with the main programs you were running. It intrigued me to realise that interrupt service routines (the actual MS-DOS/BIOS code that was supposed to run) were getting executed almost all the time, by interrupting the processor that was executing the user program. This style of multitasking was mind-boggling at the time, since computers were new to me and my frame of reference was humans — I simply could not imagine the human brain being interrupted so often and still doing useful work.
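To make the hooking concrete, here is a minimal sketch of such a TSR in the style of the Borland Turbo C compilers of that era. It is an illustration under assumptions, not code from the article: it relies on Turbo C’s dos.h helpers getvect, setvect, and keep, hooks the user timer-tick interrupt (INT 1Ch, fired roughly 18.2 times a second), and will not build on a modern compiler.

```c
/* Minimal TSR sketch, Turbo C style. Hypothetical illustration. */
#include <dos.h>

void interrupt (*old_timer)(void);   /* previously installed handler */
volatile unsigned long ticks = 0;

/* Runs on every user timer tick (~18.2 times per second). */
void interrupt new_timer(void)
{
    ticks++;          /* the TSR's "background" work */
    old_timer();      /* chain to the original handler */
}

int main(void)
{
    old_timer = getvect(0x1C);    /* save the current vector for INT 1Ch */
    setvect(0x1C, new_timer);     /* rewrite it to point at our code */
    keep(0, 1000);                /* terminate, but stay resident */
    return 0;                     /* never reached */
}
```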


PS: Most DOS viruses were actually TSR programs.


Interrupts — the new way of life


Fast forward a couple of decades and, thanks to the smartphone revolution, if I was amazed back then, I am flabbergasted now. No, not at processors — but at the volume of interrupts human brains are handling these days. Every other second, some app or the other on your smart device pushes a notification interrupting you — someone liked your post, somebody started following you somewhere, someone is vociferously refuting your point of view, someone has posted a new photograph, somebody scored a goal, and, can you believe it, something just happened in the world?!


Interrupts — a way of life today


I look around and find everyone glued to their phones, and I bet it is not just one app; they are switching through an array of them. They place the phone aside for just a few seconds before checking it again — a new notification? Wait, the brain is just checking whether there is one. That is, the human brain, now trained on interrupts, is interrupting itself to check for interrupts — it seems to have acquired the good old timer interrupt of those DOS/BIOS days!


Human brain, the new Operating System for apps


Which is the App Store? And which is the OS?

There is almost an interrupt-driven economy at play — it seems that businesses are making money by interrupting you. They want a slice of your time; they want you to use their app; they want your brain to use their app. In other words, they want their apps to run on your brain. So they keep interrupting you!


Your brain is the real operating system — not your mobile operating system. The more time their apps get on your brain, the more money they make — that is the economy. Interestingly, that makes the OS on your mobile almost an app store from which these apps get installed onto your brain.

All this begs the question of whether the human brain is even “designed” to be interrupted. And what is the productivity of the human brain in the smartphone era?


A prediction way ahead of its time


In 1971, Herbert A. Simon predicted that in the information era, a wealth of information would create a dearth of something else — the attention of its recipients. His prediction has come eerily true today — the information explosion has created a war for grabbing attention.


But guess what: instead of winner-takes-all, the players have all decided to share the pie of attention almost collaboratively. That means the attention time slices for some of these tasks shrink to seconds and minutes before the brain is “notified” to look at something else. As a result, the brain is supplied with only a few bytes of information (to use computer storage parlance) each time, which is a pity, for the amount of learning is so curtailed. Sadly, this has given birth to a new paradigm of learning, interestingly called bite-sized learning (not byte-sized!).


And mind you — attention time slices of mere seconds and minutes are something humans were never really used to in their two million years of evolution! By the way, an interesting tidbit: Herbert A. Simon went on to win the Turing Award, the Nobel Prize equivalent for computer science, for his early contributions to artificial intelligence (AI). And then he went on to win the Nobel Prize in economics!


Learning from machine learning


I talked about what captured my imagination as a rookie computer science engineer back then. Now that the word AI has crept into this article: my fascination these days, undoubtedly, has been with machine learning algorithms at a broad level, including deep learning algorithms (as with most people right now, I guess!).


Given that we have been juxtaposing the human brain with computers, I suddenly realised the following — as machine learning students and practitioners, while we are so focused on the “learning” of machines, what are we doing for humans to learn? In fact, I must admit that I went on a mild guilt trip thinking about this, wondering whether I had even spent my last several years learning enough.


The question I want to pose at this juncture, as we talk of learning, is whether the interrupt-driven functioning of our brain is conducive to learning.


Multitasking within the human brain


An interesting study at UCLA tried to measure the impact of multitasking on learning. According to fMRI scans, an activity learnt without distraction triggered the hippocampus area of the brain, while the same activity learnt while multitasking triggered activity in the striatum, the area meant for learning new skills. The group that performed the task without distraction was able to gain deeper insights about the task that the multitasking group had no clue about.


In essence, “deep learning” was happening within the brain in a uni-tasking environment, while the multitasking environment seemed to push the brain to hand off control to areas that manage unknown situations, such as new-skill learning. The distractions were probably making the brain treat the task as new, so the focus was merely on managing (or winging it!) rather than on deeper, insightful learning.


The Maker vs Manager debate


Machine learning algorithms typically have two phases — a learning phase and an apply or “do” phase. The same holds true for humans. Now, is a multitasking setting conducive to the do phase — to doing things? Definitely not for tasks like driving. Let’s see whether multitasking is an inherent need for certain tasks.
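As a toy illustration of those two phases (a hypothetical example, not from the article): a model is fitted once in the learning phase, and then applied over and over in the “do” phase.

```c
/* Toy two-phase example: fit a line by least squares, then apply it. */
#include <stdio.h>

typedef struct { double a, b; } Model;   /* y ~ a*x + b */

/* Learning phase: closed-form least-squares fit of a 1-D linear model. */
Model fit(const double *x, const double *y, int n)
{
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    int i;
    for (i = 0; i < n; i++) {
        sx  += x[i];        sy  += y[i];
        sxx += x[i] * x[i]; sxy += x[i] * y[i];
    }
    Model m;
    m.a = (n * sxy - sx * sy) / (n * sxx - sx * sx);
    m.b = (sy - m.a * sx) / n;
    return m;
}

/* "Do" phase: apply the learnt model to new inputs. */
double predict(Model m, double x) { return m.a * x + m.b; }

int main(void)
{
    double x[] = {1, 2, 3, 4}, y[] = {2.1, 3.9, 6.2, 7.8};
    Model m = fit(x, y, 4);                    /* learn once...       */
    printf("y(5) = %f\n", predict(m, 5.0));    /* ...then do, cheaply */
    return 0;
}
```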


Some of us might recall the maker vs manager schedule argument of Paul Graham, co-founder of Y Combinator, who talks of makers needing uninterrupted day plans to get things done. Managers, meanwhile, can keep context switching and can plan their day around one-hour intervals. Essentially, the task of a manager has multitasking ingrained in it in many ways. However, even for a manager, Paul talks of hourly schedules for productivity.


In the smartphone era, I think a lot of the folks who are hooked to their devices are getting context switched in and out at intervals of seconds and minutes.


Maker vs Manager Schedule?


Excuse me, May I … ?

We can all recall that, in the times we grew up in, it was impolite to interrupt someone. These days, I guess you still may or may not be able to interrupt somebody in person, but with a polite notification from a messaging app, you can get their uninterrupted, err… “interrupted” attention. In this situation, where focus does not seem to be the goal and the quantum of information being assimilated has gone down to “bite sizes” due to the ever-decreasing attention time slices, what does this augur for “learning” and “working” environments?


The answer


The answer, I guess, is blowing in the wind. As Herbert Simon predicted, the information explosion has already made attention scarce. A scarce-attention world is going to make “deep” human learning hard. But then, the only way to survive in a world of machine learning is good old “human learning”. And that deep human learning, at this point, particularly with our yet-to-evolve brains, does seem to require sufficiently long, uninterrupted time slices with undivided attention. So, let’s keep that in mind and embark on a journey of “deep learning experiences” all through our lives!



(Disclaimer: The views and opinions expressed in this article are those of the author and do not necessarily reflect the views of YourStory.)