
Podcast Episode 2: What IS Computer Science?

The transcript of episode 2 of my podcast is here!

Transcript:

Hello. Welcome to how to teach computer science, the podcast. 

This is episode two. They said it would never last. “What is computer science?” is the title of this episode, the one you’ve been waiting for: no need to study for three years, or even do an SKE for six weeks next summer. Just put this on repeat for a few days and you’re done. Heh. My name is Alan Harrison. I wrote the books How to Teach Computer Science and How to Learn Computer Science, available at many online bookstores, and you can find out more details at the companion website httcs.online. That’s the initials of How to Teach Computer Science, HTTCS, dot online. I’ve got 25 episodes planned, which will take us up to the summer holidays, and some fab guests booked in including, drum roll, please…

Andrew Virnuls of advanced-ict.info, Adrienne Tough, Andy Colley, Beverly Clark, and Harry and Anna Wake from Mission Encodable. Looking forward to inviting those fantastic people onto the podcast in a few weeks. There will be parables, practice and pedagogy in this podcast, and a lot of computer science subject knowledge, and more jokes probably, and anecdotes and other fun stuff like competitions and prize draws. As I was writing this script, yes, I wrote a script, don’t be rude, the thesaurus packed up in Microsoft Word. So I have no thesaurus now, which is terrible. It’s also terrible.

Oh, no. Now the dictionary is gone as well. I have no words. 

If you want to give me feedback on that, or anything else, or get involved, just go to httcs.online or check the show notes.

I’m also on Threads, Mastodon and X as mraharrison, or you can email me: alan at httcs dot online. Remember, if you liked this content, please subscribe, tell your friends, buy my books, leave a review on Amazon, or at the very least buy me a coffee at ko-fi.com/mraharrisoncs; details at httcs.online. The transcript will be on the blog as normal. That’s httcs.online slash blog.


So if you don’t like my voice, you can get your favourite text-to-speech engine to read out my words. Who’s this?

Alan’s podcast is essential listening for me. I tune in every week. There is no life I know to compare to pure imagination.

Alan: That was Willy Wonka actor Gene Wilder. Bye, Gene. Thanks for popping in. I wonder who’ll be on the show next week?

So let’s get today’s episode going with another fertile question. If you don’t know what that is, go back to last week’s episode. So, today’s fertile question:
What is computer science? I’m now about to tell you in under 30 minutes, using the TL;DR sections of each chapter in the book. If you’re not terminally online like I am, you might wonder what TL-semicolon-DR stands for. It is, of course, “too long; didn’t read”, and you’ll see it if you dare post anything longer than a tweet on any internet forum these days. Kids just don’t have the atten-.

So here we go then. What is computer science?

One. Data representation.

The heart of this topic is the idea that if we can turn information into binary data, we can use a computer to process it. Digital computers process binary numbers because they use two-state electrical signals. The challenge, therefore, is to find a transformation from real-world information to binary. This transformation is called encoding, and it makes use of a code. ASCII and Unicode are used to encode text, JPEG, GIF and PNG do the same for bitmap images, and WAV, MP3 and AAC encode digital sound. But it’s important to realize that there are virtually limitless ways of encoding information.

And these are just the techniques that are widely used, owing to their effectiveness or official recognition or both. Analogue-to-digital conversion is the process of mapping the original data to the digital representation, and it’s vital to understand binary to really grasp the importance of bit depth and resolution and their effect on file size. Metadata is data about data, and describes the contents of the file or something about the original information. That’s really the fundamentals of data representation covered.

The most important concept is that we need a way of encoding information as binary. Once we have that, we’ve cracked it.
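To make that concrete, here’s a minimal Python sketch, a toy example of my own rather than anything from the book, that encodes a short piece of text as ASCII binary and then decodes it again:

```python
# Encode text as binary using ASCII, then reverse the transformation.
text = "Hi!"

# ord() gives each character's ASCII code; format it as an 8-bit binary string.
binary = " ".join(format(ord(ch), "08b") for ch in text)
print(binary)   # 01001000 01101001 00100001

# Decode: turn each 8-bit group back into a character.
decoded = "".join(chr(int(byte, 2)) for byte in binary.split())
print(decoded)  # Hi!
```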


Talking of binary: I made a worksheet for my class full of binary number questions, then I went and guillotined the right-hand edge of all the pages, chopping off the last digit on the right-hand side of all the binary numbers on the worksheet. But it didn’t matter. Why?

That’s today’s competition. 

Find my tweet, Threads post or Mastodon post entitled “podcast competition” and answer this question: why did it not matter that I chopped off the last digit of all my binary numbers?

Onwards and upwards. Let’s talk about…

Two. Programming.

In 1968, Donald Knuth wrote, “The process of preparing programs can be an aesthetic experience, much like composing poetry or music.” Thank you, Mr. Knuth. Renowned Dutch computing pioneer Edsger Dijkstra is famously supposed to have said, “Computer science is no more about computers than astronomy is about telescopes.”

This isn’t on record anywhere, but he did say surgery isn’t called “knife science”. Programming is not about devices, or even keywords or punctuation or indentation. Programming exists to solve problems using a machine. First we find a way to state the problem computationally. Then we get a machine to perform the computation. The first part is what we now call computational thinking. It’s easily the largest part of the process, but novice programmers often forget this.

And sometimes expert instructors do too. Programming is about using abstraction to determine inputs, processes and outputs, and deciding which variables and data structures are needed. Then using decomposition and algorithmic thinking to design an algorithm to process the data. You will need sequence, selection, iteration and subprograms, which you will combine in a structured program, remembering to make it maintainable with meaningful white space and the use of subprograms, that’s functions and procedures, to break a problem down into smaller problems; that’s decomposition. In all programming instruction, developing computational thinking, or CT, skills is where we should spend our time. And we heard last week that in all teaching, we should consider cognitive load and make sure learners are thinking hard about what matters: getting better at designing programs using CT, and not about working out where the punctuation goes. This is why, when we’re teaching programming, we should do lots of code comprehension.

We should use PRIMM, Parsons problems, sabotaged code, smelly code and pair programming to reduce cognitive load, and I’ll be discussing all of those in a future podcast.
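Before we move on, here’s a toy Python program of my own that uses all four building blocks from a moment ago, sequence, selection, iteration and a subprogram, in one place:

```python
def average(scores):
    """Subprogram (a function): return the mean of a list of scores."""
    return sum(scores) / len(scores)

scores = [72, 85, 64, 90]           # sequence: statements run in order
for score in scores:                # iteration: repeat for each item
    if score >= 80:                 # selection: choose between two paths
        print(score, "merit")
    else:
        print(score, "pass")
print("average:", average(scores))  # call the subprogram
```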

So that was programming, but what do we need to make with our programming skills…

Three. Robust programs. 

Early programmers designed and debugged their own programs. Building in code to prevent failures due to user error or hardware failure was pioneered by Margaret Hamilton for the Apollo space program. Her work led to the creation of a new discipline, software engineering, popularized by a NATO conference in 1968. New techniques and tools were created throughout the seventies to address the software crisis and improve software quality. Glenford Myers published The Art of Software Testing in 1979, and the software development lifecycle was born.


Industry found the original waterfall development model unresponsive to change, and iterative techniques known collectively as agile grew popular in the 1990s. Many companies began to employ test automation software to reduce costs. Modern robust programming techniques you need to know include anticipating misuse through authentication, access levels, sanitization and validation.
So let’s look at those. Authentication is keeping out unauthorized users by verifying the user’s identity, usually with a password; more on that in the cybersecurity episode coming soon. Access levels limit what a user can do to their permitted functions. Input sanitization, such as removing spaces and punctuation, prevents bad data getting in and defends against SQL injection hacks. Again, more later in the cybersecurity episode. Validation means checking inputs are reasonable. For example, the birthday of a living person must be sometime in the last 150 years, right?
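Here’s that birthday rule as a minimal Python validation sketch, my own illustration with the 150-year cutoff taken from the example above:

```python
from datetime import date

def valid_birth_year(year):
    """Validation: a living person's birth year must be within the last 150 years."""
    this_year = date.today().year
    return this_year - 150 <= year <= this_year

print(valid_birth_year(1990))  # True
print(valid_birth_year(1723))  # False: more than 150 years ago
print(valid_birth_year(2999))  # False: in the future
```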

Robustness also comes from structured programming techniques, focused on modular, maintainable code that uses meaningful identifiers, indentation, white space and subprograms.

Testing is also important for robustness. Iterative testing is carried out during development, and final testing at the end. Black box testing means treating the code like a black box we cannot see into, instead checking that each input causes the expected output. White box testing, which should actually be called transparent box testing, describes testing with knowledge of the code.

For example, you might run tests that ensure every line of code is executed. Do not confuse white and black box testing with white and black hat hackers. They are not related.
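As a sketch of the black box idea, here’s a hypothetical Python function of my own tested purely by inputs and expected outputs, with no reference to its internals:

```python
def add(a, b):
    return a + b  # the 'box' under test; the tester never looks inside

# Black box tests: just pairs of inputs and expected outputs.
tests = [((2, 3), 5), ((0, 0), 0), ((-1, 1), 0)]
for (a, b), expected in tests:
    actual = add(a, b)
    print(a, b, "->", actual, "PASS" if actual == expected else "FAIL")
```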

Four. Languages and IDEs.

We need to remember that at its heart a computer is just a collection of logic circuits that process digital signals of high and low voltages representing zeros and ones. The circuits can decode patterns of zeros and ones, and we call these bit patterns instructions. Each CPU responds to a finite set of these low-level instructions: its machine code instruction set.

Coding in binary is difficult and error-prone, so each binary code is given a short, memorable name or mnemonic, such as LOAD, ADD or BRANCH. This assembly language is still difficult to code and contains no useful constructs such as loops or arrays, so high-level languages were invented, which are more English-like and allow us to write complex programs very quickly. Python, Java, JavaScript, VB.NET, C, C++ and C# are popular high-level languages.

High-level code must be translated into machine code before it can be run on the CPU. For this, we need a translator. Compilers translate the whole high-level source code program into machine code, creating an executable file of what we call object code. Interpreters translate the program one line at a time, which allows for rapid coding and debugging, but slower execution than compiled code. Assembly language may still be used for small mission-critical programs, because machine code compiled from high-level code may not be optimal. An integrated development environment, or IDE, is usually used to develop code. An IDE provides many features to speed up coding and debugging, such as syntax checking, autocomplete, stepping, breakpoints and variable tracing. As an aside, my favourite IDE for beginners is now Thonny, from thonny.org.
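Talking of translation, you can peek at something like it in Python itself. CPython compiles your source into bytecode for its virtual machine, not CPU machine code, but the instructions still have mnemonic names, and the standard dis module will show them (exact output varies by Python version):

```python
import dis

def double(x):
    return x * 2

# Show the bytecode instructions, each with a mnemonic name
# (e.g. LOAD_FAST, BINARY_OP, RETURN_VALUE on recent versions).
dis.dis(double)
```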

Five. Algorithms.

Earlier we heard that programming is not about using the correct keywords, if, while and so on, but the process of solving a problem with the building blocks of code: sequence, selection and iteration. Algorithms predate computer science by thousands of years and derive largely from mathematics and the natural sciences. Indeed, the word algorithm comes from the name of a Persian scholar, Muhammad ibn Mūsā al-Khwārizmī, who worked in Baghdad in the ninth century. Some algorithms are so useful they crop up again and again, so an understanding of searching and sorting algorithms is necessary. The bubble sort algorithm passes over a list or array of data many times, repeatedly swapping adjacent items. Insertion sort maintains a sorted and an unsorted sub-list, repeatedly picking the next unsorted item and placing it into the correct place in the sorted sub-list. Merge sort breaks a list down to individual elements, then recombines elements into sorted pairs, pairs into sorted fours and so on, until the list is whole again and sorted.
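Here’s bubble sort as a short Python sketch, one minimal version of the description above:

```python
def bubble_sort(items):
    """Pass over the list many times, swapping adjacent items that are out of order."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):   # the last i items are already in place
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:              # a pass with no swaps means we're done
            break
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```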

As for searching? Well, linear search just checks each item in the list until it finds the target, and this works on unsorted data. If our data is sorted, we can use binary search, which repeatedly checks the middle item and discards the left or right half of the array each time, which is much quicker. As we can see, two or more algorithms can be created to solve the same problem, and they will perform differently given particular inputs, so it’s important to choose the right algorithm for a task. Learners must also be able to interpret an algorithm from flowcharts and pseudocode, correct errors and complete unfinished algorithms. To help with all of this, they should be able to trace an algorithm, thus driving out logic errors.
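And binary search in the same spirit, a minimal sketch of the halving idea just described:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if it's not there."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1    # discard the left half
        else:
            high = mid - 1   # discard the right half
    return -1

print(binary_search([2, 5, 8, 12, 16, 23], 16))  # 4
```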


Okay, little break now. Let’s play the higher or lower game.

I’m thinking of a number from one to 64. Guess what it is. Oh, I’ve got a text here from a listener. I’ll text-to-speech it. 32. Ah, you’ve played this game before, haven’t you?
Lower. 16.
Higher. 24.
Higher. 28.
Higher. 30.
Lower. 29.
Correct. It was 29.

Well done, random listener on the text message there. Yes, no matter what number I choose between one and 64, you can guarantee to get it in six guesses or less. It’s one of my party tricks in the classroom, but why is this? And what has it got to do with algorithms? Well, you can message me, just for fun, to tell me the answer, or I’ll give the answer next week.

Six. Architecture.

Alan Turing described the concept of the stored program computer in 1936. John von Neumann built on Turing’s work, explaining in 1945 how a cycle of fetch, decode, execute could allow the same memory to hold programs and data. Freddie Williams led a team at Manchester University that built the Baby, which ran around 700 instructions per second in 1948. Its success led to the 1951 Ferranti Mark I, the first commercial computer, for which women wrote most of the programs. Valves gave way to much faster transistors in the 1960s, and this exposed the von Neumann bottleneck, solved by the Harvard architecture of separate memories for instructions and data.

Early memory stores included paper tape, magnetic tape, magnetic drum, acoustic delay lines and core rope memory, until semiconductor RAM arrived in the 1960s. Magnetic hard disk drives provided secondary storage from the 1950s onwards, with flash memory becoming popular in the 21st century for portable storage devices and solid state disks. Compact discs, invented in 1979, and DVDs and Blu-ray discs are examples of the third common storage type, the optical disc. Computer performance is limited by the three Cs: clock speed, cores and size of cache.


From the Baby to the modern smartphone, all CPUs still contain an arithmetic and logic unit or ALU, some registers and a control unit, and they perform a fetch, decode, execute cycle first described by von Neumann in 1945. Talking of von Neumann, I’m saying NOY-mann because he was Hungarian, and not to be confused with Max Newman, who worked at Bletchley Park with Alan Turing. So, von Neumann, thank you for your contribution, and for inventing the arithmetic and logic unit. Talking of logic…

Seven. Boolean logic.
George Boole published his paper in 1847, describing what became known as Boolean algebra. Claude Shannon saw how Boole’s work could be applied to electronics in 1938. The first digital computers used fragile valves and slow relays. Transistor computers arrived in the 1950s, greatly improving speed and reliability. Computers use a high voltage, around five volts, to represent True or binary one, and a low voltage, close to zero volts, to represent False or binary zero. A transistor acts like an electronic switch, turning the voltage on or off. Transistors can be combined into logic gates. A logic gate is a collection of microscopic transistors that perform a Boolean logic operation, such as AND, OR or NOT. Logic gates are combined into circuits inside a computer to perform arithmetic and logical operations. An AND gate takes two inputs and produces the output one, or true, only if both inputs are one: one input AND the other. The OR gate produces an output one if either one OR the other input is one, and a NOT gate inverts its input: zero to one and one to zero. We use truth tables to list the outputs for every possible combination of inputs. We can write Boolean expressions, such as Q = A AND NOT B, and then draw logic circuits, connecting symbols which represent the logic gates.
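To see a truth table appear, here’s a minimal Python sketch of that very expression, Q = A AND NOT B:

```python
# Truth table for Q = A AND NOT B, using 1 for True and 0 for False.
print("A B | Q")
for A in (0, 1):
    for B in (0, 1):
        Q = int(A and not B)   # Python's and/not mirror the Boolean operators
        print(A, B, "|", Q)
```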
Hey, talking of logic: three computer scientists walk into a bar. The bartender asks, “Do you all want a drink?” The first says, “I don’t know.” The second says, “I don’t know.” The third thinks for a minute and says, “Yes.” If you know how that works, message me on Threads, Mastodon or X. Just for fun.


Eight. System software.

Early computers were hardwired to perform a single program. Running a different program required extensive manual intervention. An IBM project called Stretch in 1961 and Manchester’s Atlas computer in 1962 provided multiprogramming features for the first time. In 1964, IBM’s System/360 delivered indexed data files, program libraries, a job scheduler, interrupt handling and print spooling.

The modern operating system was thus born. Two Bell Labs researchers, Ken Thompson and Dennis Ritchie, created Unix in 1971, which became the most popular OS on the planet by 1980. Apple’s 1984 Macintosh was the world’s first successful home computer with a graphical user interface, and a year later Bill Gates’s Microsoft released its first GUI, called Windows. Mobile versions were spun off in the 21st century, including iOS and the Windows Phone OS, which is now dead. The Finnish student Linus Torvalds released the first version of Linux in 1991, and it now runs hundreds of millions of devices, from home internet routers to Amazon’s data centre servers. Linux is open source, meaning anyone can see, copy, amend and contribute to the source code. OSs are a type of system software that exists to manage the hardware and to allow applications and users to interact with and control the system, by managing memory, CPU time slices, and input and output.

Utilities and drivers are also system software. Utilities help keep the computer running smoothly, while drivers communicate with the hardware. Anything that’s not an application is probably system software.

Nine. Networks.
 Let’s quickly look back at the creation of the internet. 
The internet, is that thing still around?
And then the worldwide web. In the 1960s, computers on university campuses like UCLA would join together in a local area network, or LAN. Then in 1969, the first wide area network, or WAN, was created between UCLA and Stanford as part of the ARPANET project. Early routers called interface message processors, or IMPs, performed packet switching: the process of breaking up data into chunks and routing it across a network, with the packets potentially taking different routes and being reassembled at the other end. This was a key strength of the ARPANET, allowing it to grow quickly and perform reliably. In 1983, the ARPANET adopted a set of standard protocols created by Vint Cerf called TCP/IP. Protocols are rules that enable very different computers to communicate. The protocols are arranged in layers, with each layer performing a single job.

At the top is the application layer, where email sits and, later, websites displayed by the browser. Throughout the 1980s, the internet was used mostly by universities and the military to access text-only services like email, FTP and Usenet. Home users arrived on the internet in the early 1990s, thanks to the first commercial ISPs, including AOL and CompuServe.

True story: I sent my first email in 1986 from Sheffield University to my friend at Newcastle University. But within weeks of starting my course, our email access was removed, because we crashed the server with chain emails full of ASCII cows. Google “ASCII cows” and thank me later. Tim Berners-Lee combined HTML with TCP/IP to create the worldwide web in 1993. This technology allows a browser to download and display pages from a web server anywhere in the world. The web has grown rapidly, and around 66% of the world’s population is now online.

 It’s important to draw a clear distinction between the internet and the worldwide web. The internet is a global network of cables, satellite links, switches, and routers that join computers together. The web is the collection of websites, apps, and services that make use of the internet to do useful things. 


Alan: Computer networks can be LANs, consisting of switches, wireless access points and Ethernet cables, or WANs, which use copper, fibre optic, microwave and satellite links to join devices over long distances. A router connects a LAN to another LAN or a WAN. The router in your home is actually a multifunction device containing a switch, wireless access point, router and modem. Right, so what’s next? Oh, I think there’s another celebrity guest.

Cybersecurity? I don’t know much about that. Don’t worry, Peter, I do.
Ten. Security.
Keeping secrets is as old as writing messages. Julius Caesar is said to have encrypted his messages by shifting each letter down the alphabet by a known shift key. An encryption method that changes each letter for another letter or symbol like this is called a substitution cipher. These are easily broken by frequency analysis, first documented by the ninth-century Arab scholar Al-Kindi.
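Here’s the Caesar cipher as a minimal Python sketch, my own illustration using uppercase letters and a shift key of three:

```python
def caesar(message, key):
    """Substitution cipher: shift each letter down the alphabet by key places."""
    result = ""
    for ch in message.upper():
        if ch.isalpha():
            # Wrap around the 26-letter alphabet with modular arithmetic.
            result += chr((ord(ch) - ord("A") + key) % 26 + ord("A"))
        else:
            result += ch  # leave spaces and punctuation unchanged
    return result

secret = caesar("ATTACK AT DAWN", 3)
print(secret)              # DWWDFN DW GDZQ
print(caesar(secret, -3))  # ATTACK AT DAWN: decrypt by shifting back
```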
During the Second World War, the Nazis used electromechanical machines called Enigma and Lorenz, which were cracked by expert mathematicians working with early computers at the UK’s Bletchley Park code-breaking centre. Modern encryption uses mathematical methods to ensure that computers cannot brute force the key. Verifying the identity of a user is called authentication. Passwords are the most common means of authentication, but a weak password can easily be brute forced by trying all possible combinations. Passwords can also be guessed, or spotted while shoulder surfing. A second layer of protection is added by two-factor authentication, or 2FA. Typically 2FA requires a code delivered by text message or generated by a token or app, or a biometric indicator such as a fingerprint or face recognition. Attacks on the network include distributed denial of service, or DDoS, and hacking attempts. Firewalls at the network perimeter will keep out unwanted network traffic, and websites should be protected against SQL injection attacks by sanitizing their inputs, as we discussed earlier.

Malicious software, or malware, consists of viruses, Trojans and worms. Antivirus, or more accurately anti-malware, software can help, but other security measures such as patching, firewalls and user training are vital. Social engineering is often called “hacking the human” and includes phishing, pretexting and shoulder surfing. For any company, educating users is important, and this should be part of the network security policy. Finally, defensive design means designing systems to be secure in the first place. This can include secure network design, code reviews, testing and anticipating misuse.

As we discussed earlier, robust programming and security thus go hand in hand, and are linked to many of the topics in the final chapter.
 How are we doing for time? 
Alan: I did say I’d do this in 30 minutes, so I’m going to have to speed this one up. Okay.
Eleven. Issues and impacts.

Information technology caused a third industrial revolution, and analysts are calling the convergence of mobile internet, automation and AI the fourth industrial revolution. With all new technology comes both opportunities and challenges. We face privacy, legal, cultural, environmental and ethical questions, and many issues span two or more of those categories, such as automation, equality, bias in decision-making and the future of work. Decisions require us to balance competing issues and impacts.

For example, automation drives down the cost of production and eliminates hazardous occupations, but can cut jobs or worsen inequality. The internet has opened up communications previously impossible, but has created a digital divide between those with access and those without. Artificial intelligence is revolutionizing manufacturing, healthcare, transport and the arts, but it suffers from bias, discrimination and lack of transparency. Cryptocurrencies such as Bitcoin have been criticized for their energy use, and electronic waste is a growing ethical, environmental and legal issue, while finite resources needed in smartphones are mined by low-paid workers in exploitative practices.

In every question about the issues and impacts of technology, we must consider all the stakeholders involved, including the creators, vendors, shareholders, consumers and wider society, and balance their often competing interests.


How have I done for time?

Wow, that was a whistle-stop tour of the GCSE in computer science. So now you’re ready to sit the exam. Or to teach the subject! So let’s revisit our fertile question.
What is computer science? Have I answered it? Let me know in the comments or on the socials. This has been How to Teach Computer Science, the podcast. I’m Alan Harrison.

If you want to give me feedback or get involved, just go to httcs.online or check the show notes. I’m also on Threads, Mastodon and X as mraharrison. Remember, if you liked this content, please subscribe, tell your friends, buy my books, leave a review of my books on Amazon, or at the very least buy me a coffee. I’m also available for staff training, INSET days and student masterclasses; see the website for details. Next week I have a guest, the amazing Andrew Virnuls, who, like me, sat the old computer studies O-level in the eighties and worked in IT for decades. So we’ll be catching up and discussing in more detail one of my favourite topics: data representation.

I’m off for a cappuccino paid for by our listener Mark Weddell. Thanks, Mark, really appreciated. If you enjoyed this, why not do the same? Don’t forget to hit subscribe, and I’ll see you next week.

If you are grateful for my blog, please buy my books here or buy me a coffee at ko-fi.com/mraharrisoncs, thanks!

By mraharrisoncs

Freelance consultant, teacher and author, professional development lead for the NCCE, CAS Master Teacher, Computer Science lecturer.
