
How To Improve Your Critical Thinking Skills


Technology provides access to vast information and makes daily life easier. Yet, too much reliance on technology potentially interferes with the acquisition and maintenance of critical thinking skills in several ways:

1. Information Overload: The constant influx of data can discourage deep critical thinking as we may come to rely on quick, surface-level information rather than delving deeply into a subject.

2. Shortened Attention Span: Frequent digital distractions can disrupt the sustained focus and concentration that critical thinking requires.

3. Confirmation Bias and Echo Chambers: Technology, including social media and personalized content algorithms, can reinforce confirmation bias. People are often exposed to information that aligns with their beliefs and opinions, making them less likely to encounter diverse perspectives and engage in critical thinking about opposing views.

4. Reduced Problem-Solving Opportunities: Technology often provides quick solutions to problems. While this benefits efficiency, it may discourage individuals from engaging in complex problem-solving, a fundamental aspect of critical thinking.

5. Loss of Research Skills: The ease of accessing information online can diminish traditional research skills, such as library research or in-depth reading. These skills are essential for critical thinking, as they involve evaluating sources, synthesizing information, and analyzing complex texts.

While technology can pose challenges to developing critical thinking skills, it's important to note that technology can also be a valuable tool for learning and skill development. It can provide access to educational resources, facilitate collaboration, and support critical thinking when used thoughtfully and intentionally. Balancing technology use with activities that encourage deep thinking and analysis is vital to lessening its potential adverse effects on critical thinking.

Writing is a traditional and powerful tool to exercise and improve your critical thinking skills. Consider these ways writing can help enhance critical thinking:

1. Clarity of Thought: Writing requires that you articulate your thoughts clearly and coherently. When you need to put your ideas on paper, you must organize them logically, which requires a deeper understanding of the subject matter.

2. Analysis and Evaluation: Critical thinking involves analyzing and evaluating information. When you write, you often need to assess the validity and relevance of different sources, arguments, or pieces of evidence, which hones your critical thinking skills.

3. Problem-Solving: Writing can be a problem-solving exercise in itself. Whether crafting an argument, developing a thesis, or finding the right words to express your ideas, writing requires thinking critically about approaching these challenges effectively.

4. Research Skills: Good writing often involves research, and research requires critical thinking. You need to assess the credibility of sources, synthesize information, and draw conclusions based on the evidence you gather.

5. Argumentation: Constructing a persuasive argument in writing is a complex process requiring critical thinking. You must anticipate counterarguments, provide evidence to support your claims, and address potential weaknesses in your reasoning.

6. Revision and Editing: To be an effective writer, you must learn to read your work critically. Editing and revising require evaluating your writing objectively, identifying areas that need improvement, and refining your ideas and arguments.

7. Problem Identification: In some cases, writing can help you identify problems or gaps in your thinking. As you write, you might realize that your arguments are not as strong as you initially thought or that you need more information to support your claims. This recognition of limitations is a crucial aspect of critical thinking.

Writing is a dynamic process that engages multiple facets of critical thinking. It has been a valuable tool used in education, business, and personal development for centuries.

Yet, this traditional approach of self-generated written thought is rapidly being supplanted by AI-generated writing tools like ChatGPT (Generative Pre-trained Transformer). With over 100 million users of ChatGPT alone, we cannot ignore its potential impact. How might the increasing reliance on AI-generated writing tools influence our critical thinking skills? The impact can vary depending on how the tools are used and the context in which they are employed.

Critical thinking involves evaluating information sources for credibility, relevance, and bias. If individuals consistently trust the information provided by chatbots without critically assessing its quality, it can hinder their development of critical thinking skills. This is especially true if they depend on the chatbot to provide answers without questioning or verifying the information. Relying solely on chatbots for answers may also reduce people's effort in problem-solving. Critical thinking often requires wrestling with complex problems, considering multiple perspectives, and generating creative solutions. If we default to chatbots for quick answers, we may miss opportunities to develop these skills.

However, it's essential to note that the impact of chatbots on critical thinking skills may not be entirely negative. These tools can also have positive effects:

1. Chatbots provide quick access to vast information, which can benefit research and problem-solving. When used as a supplement to critical thinking, they can enhance the efficiency of information retrieval.

2. Chatbots can sometimes assist in complex tasks by providing relevant data or suggestions. When individuals critically evaluate and integrate this information into their decision-making process, it can enhance their critical thinking.

3. Chatbots can be used as learning aids. They can provide explanations, examples, and guidance, which can support skill development and, when used effectively, encourage critical thinking.

In summary, the impact of chatbots on critical thinking skills depends on how we use them. The effect will be harmful if they become a crutch to avoid independent thought or analysis. However, they can be valuable resources when used as tools to facilitate and augment critical thinking and writing processes. Individuals must balance leveraging the convenience of chatbots and actively engaging in independent critical thinking and problem-solving to maintain and enhance their cognitive abilities. You can do that effectively through writing regularly.

Copyright 2023 Tara Well, PhD


AI, Expertise And The Convergence Of Writing And Coding

Is it possible yet to see through the fog of hype about generative artificial intelligence? Can we now confidently predict its long-term impacts on higher education and white-collar professions and adapt accordingly?

I think so. Let's consider two skills crucial to many professions: writing and coding.

Despite their distinct goals and histories, these two skills face a convergent future. As practiced by professionals with expertise in their fields, writing is both critical thinking and an evolving technology. So is coding.


Yet generative AI seems to obviate the need to think—beyond the ability to write prompts or simply copy and paste prompts crafted by so-called prompt engineers.

In practice, generative AI can impact the teaching and practice of writing and coding in two opposing ways:

  • Democratizing access to expertise, or specialized rhetorical knowledge, about everything from leading-edge medical research to idiomatic, corporate-friendly American English. (Such expertise, of course, was developed by humans with considerable time and effort, looted from the internet via Common Crawl and GitHub, and filtered for bias and hate speech by underpaid ghost workers in Kenya.)
  • Eroding professional expertise and literacy, with disastrous consequences, due to overreliance on AI.

    In the words of the well-known AI-skeptical paper "On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?," "most language technology is built to serve the needs of those who already have the most privilege in society." And those with the most privilege in society also, typically, have the most expertise—or at least the most access to it. Generative AI, at this point, is designed to serve experts, not novices. Not students.

    How can educators support No. 1 and limit No. 2? In short, through the cultivation of critical editing skills: the application of discipline- and context-specific expertise and sociopolitical awareness of the rhetorical situation—the ability to deeply analyze and understand the audience, purpose, genre of writing or type of code, and context for a given document or program—to edit draft text and code generated by AI.

    As Robert Ochshorn, software developer and Reduct.Video CEO, says, "Critiquing and editing code is a big part of a software developer's job, and when I was in school, it wasn't something they knew how to talk about."

    In writing, we are accustomed to editing as a late stage in the process, when we refine sentence structure, word choice and style. Though we fact-check and keep audience in mind, usually we consider our audience's needs and collect facts early in the writing process—when we face the blank page.

    However, a range of experts, from communication specialists drafting press releases to scientists drafting grant proposals, no longer face the blank page. They can begin with a draft created by a chatbot. It's also true that professional programmers in the era of generative AI rarely face the blank page—but then, they haven't done that since at least 2008 with the advent of GitHub. (GitHub is the world's largest repository of open-source code, with more than 100 million users as of January; Microsoft acquired it in 2018.)

    Critical editing, then, must bring more critical thinking into the editing process. Professionals must be able to assess how well AI-generated draft text or code accomplishes their purpose and meets the needs of their audience. This cannot be achieved simply by crafting better prompts for AI. It requires developing and applying expertise. To cultivate critical editing skills, educators must therefore remain focused on developing students' expertise, in key phases of many writing and coding assignments, with little to no help from AI.

    Though college instructors need to adapt, we need not feel lost. Where professional writing is headed, coding has been before and has some cautionary tales to offer. Conversely, where coding is headed, writing studies has some guidance to offer. And the stakes are high. If we don't get this right, the erosion of expertise, due to overreliance on AI, will severely impact higher education and the larger economy and could make costly, dangerous errors routine in the professional workplace.

    The Convergence of Writing and Coding

    Generative AI has brought writing and coding closer than ever before, not least because it threatens to render both obsolete. Many think pieces published this year predict "the end of writing." Quite a few also predict "the end of programming as we know it," because generative AI tools allow users to create programs by writing prompts in English (and other so-called natural languages), not coding languages.

    Historically, automation has not been a goal of writing. Conversely, automation is the primary goal and basis of coding, which streamlines repetitive tasks to save time and effort. Yet learning to code by learning to think like a computer—even to write a simple program that can play tic-tac-toe—requires tremendous patience and self-discipline. Coding is a form of writing—writing the strings of rigorously logical commands that run the tools we use in every aspect of contemporary life.
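    The tic-tac-toe example mentioned above can be made concrete with a minimal sketch, in Python, of the rigorously logical commands even a trivial game demands. The board representation and the function name here are illustrative assumptions, not taken from the article:

```python
# A minimal sketch of the logic behind a trivial game: checking whether a
# tic-tac-toe player has won. The 3x3 board of 'X', 'O', or None, and the
# name has_won, are hypothetical choices for illustration.

def has_won(board, player):
    """Return True if `player` ('X' or 'O') occupies any complete line."""
    lines = []
    lines.extend(board)                                                # three rows
    lines.extend([[board[r][c] for r in range(3)] for c in range(3)])  # three columns
    lines.append([board[i][i] for i in range(3)])                      # main diagonal
    lines.append([board[i][2 - i] for i in range(3)])                  # anti-diagonal
    return any(all(cell == player for cell in line) for line in lines)

board = [
    ["X", "O", "X"],
    [None, "X", "O"],
    ["O", None, "X"],
]
print(has_won(board, "X"))  # the main diagonal is all X, so this prints True
```

    Every row, column, and diagonal must be enumerated explicitly and checked exhaustively; nothing is left to intuition. That exactness is the patience and self-discipline the paragraph describes.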

    And though writers and editors collaborate, coding is, compared to writing, an extremely collaborative activity. Open-source culture liberally shares code under permissive licenses. In the culture of coding, if someone else has written and freely shared a script that accomplishes a particular task, why on earth would you write it from scratch? In academic writing culture, on the other hand, reusing text written by others is simply plagiarism.

    Generative AI, for some, is nothing but a plagiarism machine that makes it impossible to trace or credit the experts whose intellectual property or polished style has been plundered—even when, like Bing Chat, it can list sources in its response to a prompt. When a large language model learns to write smooth, idiomatic prose from billions of documents, no individual writer gets any credit. Generative AI has imposed the culture of coding, with its benefits and risks, on the culture of writing.

    One moment perfectly captures the convergent fates of writing and coding. In December 2022, moderators of Stack Overflow, "a popular social network for computer programmers … banned A.I.-generated text" because users were "posting substandard coding advice written by ChatGPT." As a moderator said, "Part of the problem was that people could post this questionable content far faster than they could write posts on their own … 'Content generated by ChatGPT looks trustworthy and professional, but often isn't.'"

    What was the rest of the problem? Why were programmers fooled by ChatGPT's coding advice?

    One reason: as a profession, software developers have depended on others' expertise, by sharing and reusing code and using AI-driven code-completion tools, for decades. This dependence speeds up software development—but overreliance on others' code also opens the door to more errors and vulnerabilities, including the case that "nearly broke the Internet" in 2016.

    Another reason: the popular, latent misconception that equates "good writing" with flawless grammar and a sophisticated vocabulary. ChatGPT has surfaced this misconception. In writing by humans, good writing typically depends upon good thinking. We are not used to getting one without the other, and so we are confused by error presented with slick delivery. But it takes expertise to recognize expertise—and to notice factual inaccuracies and other lapses.

    Because experts can recognize and correct errors and oversights when they critically edit a chatbot's draft, they are delighted to incorporate generative AI into their workflow to increase efficiency (e.g., doctors who use chatbots as scribes), while educators and conscientious students feel nervous about doing so. Not because they are unjustly maligned Luddites (or Hollywood screenwriters in fear for their jobs). But because the goal of education is to learn and develop expertise, not to save time and effort for already-trained experts.

    "[With Copilot] I have to think less, and when I have to think it's the fun stuff. It sets off a little spark that makes coding more fun and more efficient," says an unnamed senior software engineer in a promotional blog post for GitHub Copilot, an AI coding-assistant tool that generates code in response to prompts in English.

    Good for the senior software engineer. But what about those who teach coding? What about students?

    Since its launch in July 2021, GitHub Copilot has already disrupted the teaching of coding as much as ChatGPT is now disrupting the teaching of writing. Researchers at Gonzaga University in 2022 found that Copilot "generates mostly unique code that can solve introductory [computer and data science] assignments with human-graded scores ranging from 68% to 95%." ChatGPT has intensified the challenge for the teaching of coding, since it can write code, too.

    Obviously, in this new environment, students will still need to learn to code from scratch. But they also need to develop strong critical editing skills so that they can judge and edit AI-generated drafts. The pressing need to develop programmers' critical code editing skills is nothing new—but generative AI has made it more obvious.

    Slowing Down in the Computer Science Classroom

    Matthew Butner and Joël Porquet-Lupine, who teach computer science at the University of California, Davis, are worried. GitHub Copilot and ChatGPT, in Butner's view, are "useful tools to accelerate learning, but my worst fear is that [students] will cheat themselves out of learning." To the best of their ability, they want to prevent students from using any generative AI tools in introductory courses, and they plan to proctor all exams.

    "We're going to have to change the whole assignment structure [in the computer science major] sooner rather than later," says Porquet-Lupine. He adds that students must "first learn the fundamental programming skills in introductory classes," without AI, and that instructors must "redesign the advanced classes to integrate ChatGPT [and Copilot] properly, and legally"—addressing the problematic fact that both tools mostly rely on open-source code without license or acknowledgment.

    Recognizing that students need to slow down to learn to code well, in the fall of 2022, Butner began to lead them through a problem-solving guide. Students must define the problem, research how others have solved it, write the steps in English for solving the problem, submit all this to be graded and only then begin to write the code.

    He requires these steps precisely because students were rushing to write code before they really understood the task, drafting code that did not work and failing to learn crucial coding skills such as identifying logical errors and gaps, balancing performance and flexibility in software design, and so on.
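    The workflow described above, English steps first, code second, might look like the following sketch. The assignment, the English plan, and the function are all hypothetical illustrations, not Butner's actual materials:

```python
# Hypothetical assignment: report the average of a list of exam scores,
# ignoring invalid entries. Following the problem-solving guide, the plan
# is written in plain English first, then translated line by line into code.
#
# Steps in English (drafted and reviewed before any code is written):
#   1. Keep only scores that are numbers between 0 and 100.
#   2. If no valid scores remain, report that no average exists.
#   3. Otherwise, divide the sum of the valid scores by their count.

def average_score(scores):
    # Step 1: filter out non-numeric and out-of-range entries.
    valid = [s for s in scores if isinstance(s, (int, float)) and 0 <= s <= 100]
    # Step 2: no valid data means no average.
    if not valid:
        return None
    # Step 3: arithmetic mean of the valid scores.
    return sum(valid) / len(valid)

print(average_score([90, 85, "absent", 102, 75]))  # prints approximately 83.33
```

    Writing the English steps first exposes the logical gaps (what counts as a valid score? what happens when none are valid?) before any code exists, which is precisely the point of requiring the plan to be graded first.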

    Butner does allow students to use AI coding assistants like Copilot in advanced courses and on collaborative projects—and of course, professional software developers and software engineers use these tools, too. They report that these tools increase efficiency but are "useless for anything novel," in the words of software engineer Kushant Patel (formerly of the Center for Mind and Brain at UC Davis, now at the Lawrence Berkeley National Laboratory).

    Patel worries that, if programmers overrely on it, generative AI will quickly run out of the expert code, written by humans, that it needs to train itself. And he feels strongly that students should not be allowed to use AI in coding before the age of 13 or 14, and only then with "mandatory training."

    Critical Editing (for Audience, Purpose, Genre and Context)

    We cannot put the latest Pandoras—those smiling corporate vampires that are ChatGPT, GitHub Copilot, Google Bard, etc., which feed on human text and code rather than blood—back in the box. We must prepare our students to enter a workforce that is incorporating generative AI into every conceivable workflow.

    To do so, we must define the expertise we mean to teach in our fields, at every level, and ensure that we help students acquire it without overreliance on AI. And whatever time is saved in the classroom by allowing students to rely on generative AI to produce rough drafts or draft code must be devoted to developing critical editing skills, with attention to audience, purpose, genre and context. Practicing these skills will need to happen in the classroom, where students and younger workers especially benefit from the power of proximity to mentors—that is, human experts.

    Experts can use generative AI to make writing and coding more efficient. Novices can use it, as Butner says, to cheat themselves out of learning. And if educators let them do that, generative AI will defeat humans not with some spectacular sci-fi mischief, but simply by making humans dangerously incompetent.

    Marit J. MacArthur teaches in the University Writing program at the University of California, Davis, where she is also associate director of Writing Across the Curriculum (graduate level), and a faculty affiliate in performance studies. Since 2015, she has overseen the development of open-source software for digital voice studies research, with funding from the American Council of Learned Societies, the National Endowment for the Humanities, and the Social Sciences and Humanities Research Council of Canada.


    How Interlinking Learning Promotes Critical Thinking In Middle School

    Every year around Halloween, our middle school in Spain divided students into four assigned houses, venturing beyond the typical core classes to engage in collaborative activities centered around one spooky concept. One year, it was all about pumpkins; the next, vampires became the seasonal topic of choice. 

    This wasn't just a festive tradition: It was a vibrant representation of interlinking learning in middle school—the instructional approach that involves connecting content across different subject areas, fostering a more integrated and holistic understanding of material. Interlinking learning enhances the learning experience, as it allows students to transfer skills and knowledge acquired in one area to another. It also helps students to see patterns, relationships, and contradictions across subjects and promotes critical thinking, a vital skill in today's fast-paced world.

    What is Interlinking Learning?

    Interlinking learning is based on the main principles of connectivity, context, and applicability. It encourages students to build connections between concepts across different subjects, placing learning in a real-world context and emphasizing the application of knowledge in various scenarios. The philosophy can be traced back to John Dewey, a renowned educator and philosopher, who encouraged connected learning and emphasized learning through doing, advocating for a practical, interactive, and student-centered approach to education.

    Interlinking learning fosters vibrant classroom environments where students can make connections between different subjects. Around Halloween, a variety of fun activities create connections for students around a spooky theme; for instance, "pumpkins meet pi" is a spirited lesson on measurement. The story of how carved turnips evolved into the pumpkins we associate with Halloween today is a fascinating history lesson about Irish immigration to America. A vampire theme can bring intrigue, with language arts classes dissecting passages from Dracula, and science lessons can become hubs of discovery as students sink their teeth into studying different blood types.

    Similar to the Halloween themes of pumpkins and vampires, teachers can use a "big idea" for connected learning, where the same idea is seen, felt, and experienced across subjects. The big idea, whether it is adaptability or justice, is an excellent example of interlinking learning, urging educators to focus on a central concept that transcends individual subjects.

    Connectivity

    Promoting connectivity in the classroom involves creating a nurturing environment where students can interlink concepts from various subjects. Consider the following strategies:

    Cross-subject projects: Encourage students to work on projects that require knowledge and skills from different subjects, thereby fostering an understanding and application of diverse concepts. For example, a project where students design a sustainable city could require knowledge of geometry (math), an understanding of sustainable practices (science), and historical context (social studies).

    Thematic learning weeks: Organize learning weeks that focus on a specific theme, weaving in various subjects to offer a rich, multidimensional perspective on the topic at hand. For instance, during Space Week, English classes could focus on science-fiction literature, while science classes delve into the solar system's dynamics.

    Connecting concepts to current events: This strategy can involve linking concepts being taught to unfolding events globally, offering students a dynamic and contemporary context to anchor their learning. For instance, social studies could relate a historical event being studied to a current geopolitical situation, encouraging students to see the repercussions and interconnectedness of historical events in today's world.

    Context

    This approach to learning is greatly influenced by situated learning theory, which suggests that learning is most effective when it is closely related to the real-world context in which the knowledge or skill will be applied. Here are some examples:

    Guest speakers: Inviting guest speakers can provide students with a rich contextual background for the topics they are studying, such as a local author discussing their writing process in a language arts class.

    Field trips: Field trips can offer firsthand experiences that enhance understanding and retention. For instance, a visit to a local museum can offer a rich, contextual background for a historical period being studied.

    Case studies: Facilitate deep understanding through case studies, encouraging students to apply conceptual knowledge to solve real-world problems. For instance, analyze a recent environmental case to understand the practical applications of scientific concepts in real-life scenarios.

    Applicability

    Applicability calls for an emphasis on the practical use of knowledge and skills in various scenarios, aiding students in understanding how they can apply what they learn in real-world situations. The following are strategies to consider:

    Problem-solving: Promote active engagement with learning materials through problem-solving, encouraging a cycle of reflective thinking. For instance, setting up a mathematics problem that involves budgeting for a small business can bring real-world applicability to theoretical concepts.

    Internship opportunities: Facilitate platforms where students can apply the knowledge garnered in classrooms to real-world settings, offering a firsthand experience of the applicative value of their learning. For example, a student interested in journalism could intern at a local newspaper, applying their language arts skills in a practical setting.

    Simulations: Create environments where students can safely yet realistically explore the practical utility of the concepts learned through simulations, bringing theoretical knowledge to life. For example, a mock trial in a civics class can help students understand judicial processes firsthand.

    Implementing Interlinking Learning in Middle School

    While interlinking learning offers numerous benefits, it can be challenging to balance the curriculum and find time to implement it effectively. There might also be resistance from individuals who stick to traditional teaching methods, so it's vital to showcase the benefits of interlinking learning through demonstrations and discussions to overcome this resistance. 

    When implementing interlinking learning, start small, perhaps by drawing parallels between subjects during a single lesson or using a theme to encompass different disciplines over a week. As comfort and familiarity grow, gradually integrate more complex interlinking concepts into your lessons to foster a richer, more interconnected learning experience.

    As we steer our students toward a future where interdisciplinary understanding is paramount, integrating interlinking learning into our educational approach becomes not just beneficial but necessary. Whether it is picking apart pumpkins through multiple classes or exploring a chosen big idea, fostering connections between subjects enriches learning and aids in the development of well-rounded individuals ready to navigate a multifaceted world. 





