There’s been a flurry of recent news and product releases on artificial intelligence and the amazing things it can do for us.
Of course, as somebody who researches the topic and advises leaders on both the buy-side and sell-side, I am ecstatic about the possibilities with AI. The layers of improvement it can bring to productivity, output and more are eye-opening! This is just the start, and as many would say, we are barely scratching the surface with artificial intelligence.
While this is all wonderful news, I am forced to ask: is all this intelligence making us a tad stupid, if not a lot?
While this question may sound bizarre, I’m not the first or last to pose it. Read this profoundly personal note to understand my rationale. This note sheds light on my worries for the upcoming generation, which may completely miss out on the basics that most of us have grown up with. These basics make us humans unique and have formed the bedrock for most intelligence documented in recent history.
Here’s some context from my personal experiences
As part of a master’s degree curriculum, one is tasked to write a dissertation on a topic of choice. Since I finished two master’s degrees from the UK in succession, I also had the pleasure of authoring two dissertations in two consecutive years. And to be clear, neither dissertation was a group effort; each was a solo expedition I took head-on. Those who have been through a dissertation will understand that writing even one is a tall ask. So, two in a row is nothing short of an ordeal. But yes, lots of good came out of it. More on that later.
For those who don’t know, a dissertation is often considered the parent of multiple theses since it is an extensive and detailed exercise. It is seen as the holy grail on a topic from which multiple theses can be written. Typically, any dissertation worth its salt takes around six months to complete, running between 50,000 and 100,000 words or more. Interestingly, many end up with an appendix that is more detailed than the main report. That’s the level of detailed analysis required to make a dissertation worthy of that status. This level of analysis is also often seen as the bare minimum for any individual to be termed an expert on the topic.
The process of researching, analysing and writing the dissertation is what truly sets it apart and makes it the gold standard in critical thinking and reasoning. The individual, or a group, is expected to read through and analyse hundreds, if not thousands, of research journals, books and other academic papers. The intent is not only to gather insights from existing academic content to support the hypothesis but, more importantly, to analyse this content with a lens of critique to find points of disagreement, if not gaps. Imagine critically analysing theories and research written by experts with doctorate degrees, decades of experience and sometimes even notable awards. Of course, all these insights and disagreements need to be proven or otherwise by extensive field research that spans both qualitative and quantitative methodologies.
As exhausting as the above paragraphs sound, the process is, in the truest sense, lengthy, time-consuming and, most importantly, intense.
So, how did writing two dissertations do me good?
This hits home since I learned immensely from writing these dissertations and have applied those learnings extensively in my work as an Industry Analyst and an entrepreneur. Allow me to explain; I will avoid specifics since the intent is to highlight key learnings and no more.
1/ The most important undertaking of any research exercise, let alone a dissertation, is choosing a topic with a sharp hypothesis that needs testing, a holistic plan to cover the most ground on the said hypothesis, a thorough methodology and a well-rounded structure that covers all key details for the readers.
Skills onboarded – research design, research methodologies, content structure and design, effective planning and writing, time management, critical thinking, communication and more.
2/ Dedicate sufficient time to reading research journals, books and other academic content. This step is critical both before setting the hypothesis and then after the launch of the research to cover ample academic ground. As mentioned earlier, this not only helps find solid evidence to prove the hypothesis and gather insights but, more importantly, helps the author find gaps in the already published academic writings and push new groundbreaking ideas on the said subject.
Skills onboarded – critical thinking and reasoning, research design and understanding, literature review, gap identification, evidence evaluation, scientific writing and other skills.
3/ Preparing and administering questionnaires and surveys, then cleaning and analysing the data to surface the not-so-obvious outcomes. Remember, data is much like a diamond. It needs cleaning, polishing and shaping before it can be considered precious. From years of working with data, I can confidently share that almost every data set has many hidden gems. The idea is to correlate data points across different variables and find outcomes that the naked eye may otherwise miss.
Skills onboarded – questionnaire and survey skills, statistical analysis skills, statistical tools like SPSS, advanced dashboard tools, data cleaning, data correlation and a host of other skills.
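The clean-then-correlate flow described in item 3 can be sketched in a few lines. The survey variables and numbers below are entirely hypothetical, and a real study would use dedicated tools like SPSS; the point is only to show why cleaning must precede analysis and how correlating two variables can surface a relationship the naked eye may miss:

```python
import math

# Hypothetical survey responses: (respondent_id, weekly_reading_hours, critical_thinking_score)
# None marks a missing answer -- raw survey data is rarely complete.
raw = [
    (1, 2.0, 55.0),
    (2, None, 60.0),   # incomplete response: dropped during cleaning
    (3, 5.0, 72.0),
    (4, 8.0, 80.0),
    (5, 3.0, 58.0),
    (6, 10.0, 91.0),
]

def clean(rows):
    """Keep only complete responses -- the 'polishing' step before any analysis."""
    return [(x, y) for _, x, y in rows if x is not None and y is not None]

def pearson(pairs):
    """Pearson correlation coefficient between two cleaned variables."""
    n = len(pairs)
    xs = [p[0] for p in pairs]
    ys = [p[1] for p in pairs]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

cleaned = clean(raw)
print(f"usable responses: {len(cleaned)}")
print(f"correlation: {pearson(cleaned):.2f}")
```

A coefficient near +1 or -1 flags a strong relationship worth investigating; a value near 0 suggests none. Of course, correlation is only the starting point for the kind of interpretation an experienced researcher brings.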
4/ Penning the report in an authentic voice that tells a story, rather than just sharing findings from the survey and research. Reporting data from Excel sheets and referencing existing research work is a task that even a novice can attempt. An experienced researcher or analyst will delve deep into the subject, connect the dots between the data and the research hypothesis and, most importantly, convey this in a manner easily understood by the reader. The most important aspect any experienced hand takes care of is stitching all key outcomes together so they form part of the bigger picture and answer the most pressing question for the reader: why do these outcomes/trends matter, and what do they mean?
Skills onboarded – grammatical correctness, clarity of voice, an engaging writing style, ruthless editing for sharp content, and delivering content in an engaging, storytelling manner.
5/ Any gold standard research note must be checked in detail for plagiarism, list sources and endnotes in Harvard-style referencing, and put together a thorough annexure. The fact is, anyone can read and write a research note. But then, not all research notes are the same. Some are half-hearted attempts that lack the rigour on these operational counts to be world-class.
Skills onboarded – the importance of plagiarism checks and fact-checking, listing sources in a precise manner, information management, content thoroughness, paraphrasing and lots more!
How is AI stopping an entire generation from learning core skills?
Well, some would say that the question above is inaccurate. They would even go so far as to say that I am overstating the impact of artificial intelligence. As someone who went through the learning curve before the emergence of AI and has also used AI tools to create research content, I can confidently say otherwise. Let me share an example.
Only recently, the Indian Government announced the union budget for the upcoming financial year. Unlike previous years, when I read through every line of the budget and analysed each line item of the expenditure sheet, this time I used AI tools to analyse the announcements on all things technology, digital and artificial intelligence. While these tools did help summarise the findings from the documents, they naturally missed the more complex findings that only an experienced eye can catch. More context below.
The Government of India, in this budget, announced a fantastic mission on nuclear energy. Since this announcement does not fit the traditional description of technology and digital, most of these tools did not include the mission in the summary. On the contrary, it is a significant announcement because nuclear energy will be critical to India’s Data Centre ambitions. Globally, be it Google, Microsoft, or other major players, all are increasingly flocking to providers of small modular reactors (SMRs) as alternative energy sources to reach net zero carbon emissions. Had it not been for this insight and knowledge that my team and I have, we may well have missed this in our research note. Furthermore, this mission is all the more important because India has only recently announced its ambitions to participate in the global semiconductor supply chain. Such nuclear energy sources will be a key part of the puzzle to win this game. Again, this is something that most AI tools missed.
Now imagine the implications of something like this when undertaking a study that has hundreds, if not thousands, of such correlations between different announcements, data points and more. Many would argue that AI tools are learning quickly, and only recently, OpenAI announced Deep Research, a tool meant to solve this very problem. I do not doubt that this and other AI tools can become advanced enough to achieve mastery of complex research topics. It’s bound to happen sooner rather than later.
But the question is, will these tools not cut short our learning curve? The fact is, there’s no shortcut to acquiring these core life skills. When we read through documents, analyse the content and make our own inferences, we truly learn and grow. This also applies to catching grammatical errors, checking for clarity of voice and an engaging writing style and, more importantly, writing factually correct content and research. There’s immense value and skill in learning to dot the i’s and cross the t’s. The sooner we understand it, the better. But the tide is moving in quite the opposite direction. This is visible in the rise of picture-perfect content across social media that reads robotically and lacks any form of original and creative thinking. I wonder how many of us have genuinely thought the implications through.
Having said all of that, I must point out that, in some ways, we’ve witnessed such changes before. This may come as a surprise to a few of you, or maybe not. We first noticed such changes when calculators started helping us solve basic mathematical operations like addition, subtraction and even multiplication. Before we knew it, they were being used in schools, and critical, essential skills such as mental maths were lost along the way. We noticed a similar change when Wikipedia launched and both students and professionals started copying information from it. The only difference with artificial intelligence is that it’s a much smarter and faster version of this change. That’s all. But we’ve been here before, albeit in much dumber ways.
As AI becomes democratised, what are our options?
No, I am not a naysayer on technology and artificial intelligence. I am anything but. However, it is beyond doubt that as AI makes its presence felt, we are increasingly neglecting the basics of critical thinking, reasoning, writing and more. And it’s well known that iron, when it rusts, weakens with time and loses strength. Our core skills behave in much the same way. Either we continue to hone them, or we must learn to live with a somewhat rusty version of them!
While the current generation’s core skills rust, the coming generations will have it much worse. Just as the generation after us doesn’t know a world without the internet, the generation born now, or those in their early years, won’t know a world without artificial intelligence. While this may not be a problem prima facie, the long-term impact of such changes means these generations will take for granted multiple cognitive functions that our generation and those before us spent years honing. A classic example of this change is the dramatic drop in attention spans and patience levels. According to various studies, the average attention span is now about 8-9 seconds. I’m sure only a handful of you have made it this far, even in this note. Another telling example is the loss of good handwriting. There was a time when one’s handwriting was a topic of conversation; it has now been replaced by how fast we can type and, of course, how well we can use AI tools to get work done.
So, what are our options? Well, it all boils down to first developing a better understanding of the impact of these AI tools. It starts with making ourselves aware of their profound impact on our cognitive functions. It starts with us deciding, at our own level, to stop over-relying on these tools. In many ways, it’s similar to people turning back to physical books and away from eBooks. Better sense has prevailed, but only after much damage had already been done to our eyes, brains and other cognitive functions. A few studies have even gone so far as to find that “readers who used Kindles were less competent in recalling the plot and events in the book than those who used paperbacks”. So, essentially, when using an eBook, our brain tends to retain less, and we miss out on critical information. Now, with AI literally compressing months’ worth of research into hours and minutes, imagine the consequences for our mental health and cognitive development. It’s about realising that technology, whether AI or any other, is an enabler and a means to an end, not the end itself.
However, the one option we don’t have is not to adopt AI. Like it or not, AI is here to stay, and in a significant way. The best course of action from here on is to encourage fair and transparent use of AI, starting in schools. It’s about learning to use AI tools alongside core human cognitive functions, treating them as garnish on the main dish. Lest we forget, AI cannot make relationships work. It cannot establish friendships, fall in love, marry, bear children, etc. So, as humans, we need to dial back and better understand what makes us human in the first place. The sooner we realise that emotions, relationships, feelings and cognitive functions make us human, the sooner we can work out a comfortable relationship with AI.
But as we learn to work alongside AI, we must pause and reflect on whether we are all caught in one large echo chamber. I, for one, think we are. This might not seem obvious at first, but like it or not, sophisticated PR and marketing machinery, aided by big budgets, is pushing the AI agenda so hard that we are failing to see beyond it. Be it social media, videos, podcasts or even newspapers, everyone is so consumed with the advancements and possibilities of AI that most of us aren’t stopping to reflect. Those who do are countered with the logic that this is just like the wave of industrialisation we have all lived through, and that change is inevitable. While all of that is true, what is also true is that we are leaving behind a much poorer earth. We are leaving behind a generation too busy scrolling short videos to focus on developing core skills and specialisations.
Net-net, we are leaving behind an ailing earth with depleting resources, smarter machines and generations that will struggle with life and cognitive skills. I sometimes wonder what history will teach future generations about those who lived before them, i.e. us.

Analyst In Focus: Sanchit Vir Gogia
Sanchit Vir Gogia, or SVG as he is popularly known, is a globally recognised technology analyst, innovation strategist, digital consultant and board advisor. SVG is the Chief Analyst, Founder & CEO of Greyhound Research, a Global, Award-Winning Technology Research, Advisory, Consulting & Education firm. Greyhound Research works closely with global organizations, their CxOs and the Board of Directors on Technology & Digital Transformation decisions. SVG is also the Founder & CEO of The House Of Greyhound, an eclectic venture focusing on interdisciplinary innovation.
