What Kind Of Computers Are Used At NASA To Explore The Universe?

While everyone is obsessed with the pictures of the universe taken through NASA’s James Webb Telescope,

Only the tech-savvy ask the question: what kind of super-powerful computers does NASA use to explore the universe?

At first, one might think it must be massive supercomputers that cost millions, maybe billions,

Well, that might not be the case.

A few years ago, we saw this picture.

The picture dates back to 2019, when the Event Horizon Telescope project unveiled the first clear image of a black hole.

Processed on a MacBook Pro?!

Yes, the same Apple laptop sold to ordinary people starting at $1,299.

But is that true?

Well,

What Is NASA?

While NASA is committed to cosmological research, office politics are what gave rise to the organization. 

NASA was created by the National Aeronautics and Space Act, signed into law on July 29, 1958, to “provide for research into problems of flight within and outside the earth’s atmosphere, and for other purposes.” One of those additional purposes was “to overcome the interservice rivalries that had confused the U.S. missile and space programs.”

That’s history. Today, NASA is the biggest space agency in the world, associated with high-profile milestones such as the first picture of a black hole, the first close-ups of Pluto’s surface, the Moon landing, and most recently, the James Webb Telescope’s images of the universe as never seen before.

Now, back to computers.

What Computers Does NASA Use?

IBM 7090/7094

NASA sought the assistance of the International Business Machines Corporation (IBM). At the time, the Mercury and Gemini programs’ mission control and data management were powered by IBM mainframe computers from the 7090 series.

The mainframe IBM 7090 was replaced by the IBM 7094. In order to keep up with the expanding scientific workloads in the 1960s, IBM notes that the advanced solid-state IBM 7094 provided significant increases in internal operating speeds and functional capacities. Depending on the particular application, the powerful IBM 7094 had an internal processing speed that was between 1.4 and 2.4 times faster than the 7090. 

It is noted that as advanced as the 7094 was, it was only as fast as a personal computer of the late 1980s.

Increased demand for memory and processing, like with many early computing technologies, compelled organizations to find more powerful solutions. The System 360 series was introduced by IBM in 1964, and according to NASA, it was “a compatible line of several computers of different sizes using a new multiprocessing operating system that owed some of its characteristics to the company’s NASA experiences.” 

Although the first three Apollo (unmanned) missions were supported by the four remaining 7094 computers, IBM used the first 360 replacements to start developing software for the Apollo lunar missions, according to NASA.

The Apollo Guidance Computer (AGC)

It is important to keep in mind that many of the AGC’s features were either created specifically for the project or modified for it.

The DSKY (display and keyboard) interface was invented to help astronauts use the computer, and the core rope memory system had only recently been developed at the time.

The DSKY had astronauts enter commands as pairs of two-digit codes, one standing for a “verb” and the other for a “noun.” Verbs described the actions the computer was to carry out, while nouns referred to the data those actions would affect. Working from information entered through the DSKY, astronauts in the spacecraft’s lower equipment bay could use a sextant to align the Inertial Measurement Unit.
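
To make the verb-noun scheme concrete, here is a minimal Python sketch of how such a command interface can be modeled. The codes and names below are illustrative stand-ins, not the actual AGC assignments:

    # Minimal sketch of a DSKY-style verb-noun command scheme.
    # The two-digit codes here are illustrative, not the real AGC tables.
    VERBS = {
        16: "monitor",  # verb: the action to perform
        37: "run",
    }
    NOUNS = {
        36: "mission_clock",  # noun: the data the action applies to
        62: "velocity_and_altitude",
    }

    def execute(verb, noun):
        """Dispatch a two-digit verb (action) against a two-digit noun (data)."""
        if verb not in VERBS or noun not in NOUNS:
            return "OPR ERR"  # unknown code: signal an operator error
        return f"{VERBS[verb]} {NOUNS[noun]}"

    print(execute(16, 36))  # -> monitor mission_clock
    print(execute(99, 1))   # -> OPR ERR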

Today

Which computers does NASA use today? There are really two answers, because there are computers flown on the spacecraft themselves and computers back on Earth that handle guidance and run the calculations that keep those craft up there.

The Pleiades Supercomputer is the monster in charge of simulating the scenarios for launching people into space from our massive rock, with the goal of keeping them up there long enough to carry out their tasks and return safely to solid ground.

This beast lives in the NASA Advanced Supercomputing facility at NASA’s Ames Research Center in California.

However, workstations and laptops are fairly typical devices you would find in any governmental setting. In fact, the NASA Advanced Supercomputing (NAS) Division was the first to connect supercomputers and workstations, enabling distributed computation and visualization, what we now refer to as client/server.

Scientists and engineers can request a platform of their choosing, and Macs are frequently selected. Mac users tend to use their machines for everything, running X11 applications without installing extra software and emulating Windows when needed.

According to Robert Frost, an instructor and flight controller at NASA, most Apple computers are found at the more research-driven centers rather than at operations-oriented centers.

HP, IBM, and Dell produce the workstations at NASA facilities and the laptops used on the ISS. The IBM ThinkPad was the primary laptop on the Space Shuttle, approved for use after passing tests for off-gassing, radiation, thermal stability, fire, and fire suppression.

Conclusion

NASA landed people on the Moon with computing so modest that your dishwasher is smarter than the Apollo Guidance Computer.

The agency relies on innovative software and systems running on commercial-grade devices to explore the universe. Back in the 1960s, Margaret Hamilton led the development of the flight software for the Apollo missions. In fact, she is thought to have coined the term “software engineering.”

So next time you think your brand new computer might make you a NASA scientist, remember that humans went to the moon, not computers.

Can Humans Successfully Cultivate A Life In Space?

Every millennial’s dream is to get away from the city and start a quiet life in the suburbs.

What about starting a new life in space?

Is there life on Mars? Is there life on the moon?

Is it possible to cultivate crops and have drinking water there?

What is this process called?

Terraforming

Terraforming, literally “earth-shaping,” is the process of altering a planet, moon, or other body to give it a more hospitable atmosphere, temperature, or ecology.

Renowned astronomer Carl Sagan proposed using planetary engineering techniques to change Venus in a 1961 article that appeared in the journal Science.

His proposal involved seeding Venus’ atmosphere with algae, which would convert the planet’s abundant nitrogen, carbon dioxide, and water into organic compounds and lessen the runaway greenhouse effect.

Sagan followed this with a similar proposal for Mars: to make the polar ice caps melt faster and create more “Earth-like conditions,” low-albedo material could be spread on the caps, or dark plants grown there, so they would absorb more sunlight.

In 1976, NASA officially addressed the topic of planetary engineering in a study, which concluded that photosynthetic organisms, the melting of the polar ice caps, and the release of greenhouse gases could together produce a warmer, oxygen- and ozone-rich atmosphere. The same year, the first conference session on terraforming, then known as “Planetary Modeling,” was organized.

The First Terraforming Colloquium was organized in March 1979 as a special session at the tenth Lunar and Planetary Science Conference, which is held annually in Houston, Texas.

Participants explored the potential for a self-regulating Martian biosphere, including the procedures that would be needed to create one. The term “terraforming” appeared in the title of a published article for the first time, and it would go on to become the preferred term.

A few years later, James Lovelock and Michael Allaby’s 1984 book The Greening of Mars popularized the idea of adding chlorofluorocarbons (CFCs) to the Martian atmosphere to trigger global warming.

The book inspired the biophysicist Robert Haynes to begin endorsing terraforming as a component of what he called ecopoiesis.

This term describes the starting point of an ecosystem and is derived from the Greek words oikos (“house”) and poiesis (“production”). It includes a type of planetary engineering in the context of space exploration, where a viable ecosystem is created on an otherwise lifeless planet.

This starts with microbial life being introduced to a planet, which creates conditions resembling those of a prehistoric Earth. The importation of plant life, which speeds up the production of oxygen, comes next, and is followed by the introduction of animal life.

Kenneth Roy investigated the idea of enclosing an alien world in a substantial “shell” to keep its atmosphere contained while allowing long-term changes to take hold.

Another concept is “paraterraforming,” also known as the “worldhouse” concept, which involves sealing a usable portion of a planet inside a dome in order to change its environment. The worldhouse approach could be used to terraform areas of a variety of planets that are otherwise inhospitable or cannot be completely changed.

Potential Sites To Start A New Life

Mercury

Mercury’s surface is largely inhospitable to life, with temperatures that alternate between being extremely hot and being extremely cold. 

However, organic molecules and water ice are present in its northern polar region. Consequently, colonies could be built there, and paraterraforming would be possible.

The northern region could be changed to accommodate human habitation if domes of sufficient size could be constructed over the craters. 

This could be achieved by directing sunlight into the domes with mirrors, gradually raising the temperature. Once the water ice melted, it could be combined with organic compounds and finely ground sand to create soil. Plants could then produce oxygen, which, combined with nitrogen gas, would create a breathable atmosphere.

Venus

Venus’ atmosphere would need to undergo chemical reactions in order to become breathable and lose some of its density.

In one scenario, atmospheric CO2 would be converted into graphite and water by introducing hydrogen, with an iron aerosol serving as the catalyst. Because Venus’ terrain is relatively flat, the resulting water would eventually fall to the planet’s surface and cover about 80% of it.
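
The article doesn’t name the chemistry, but the scheme described matches the well-known Bosch reaction, in which hydrogen reduces carbon dioxide over an iron catalyst:

    CO2 + 2 H2 → C (graphite) + 2 H2O

The iron would act as a catalyst rather than being consumed, which fits the “iron aerosol” in the proposal above.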

Speeding up Venus’ rotation has also been proposed as a terraforming technique. If the planet could be spun up until its day-night cycle resembled Earth’s, it might begin to produce a stronger magnetic field.

This would block much of the solar wind and its accompanying radiation from reaching the surface, making the planet safer for terrestrial organisms.

The Moon

The Moon is the closest celestial body to Earth, so colonizing it would be relatively simple in comparison to other bodies. 

But when it comes to terraforming the Moon, the possibilities and difficulties are very similar to those of Mercury because the Moon has a very thin atmosphere and a dearth of the elements needed for life—hydrogen, nitrogen, and carbon. 

By capturing comets that have water ice and volatiles and crashing them into the surface, these issues could be solved. These gases and water vapor would be dispersed by the sublimating comets to form an atmosphere. The water trapped in the lunar regolith would also be released by these impacts, and this water could eventually build up on the surface to form natural bodies of water.

The momentum transferred by these impacts would also speed up the Moon’s rotation, so it would no longer be tidally locked. If the Moon could be spun up to rotate once every 24 hours, the resulting stable day-night cycle would make it much easier to colonize and adapt to.

Mars

Mars is the most frequently selected location for terraforming. Its proximity to Earth, its similarities to Earth, and the fact that it once had an environment very similar to Earth’s — one with a thicker atmosphere and warm, flowing water on the surface — are a few of the factors contributing to this. And finally, it’s thought that Mars may have more underground water sources than is currently known. 

In essence, the day-night and seasonal cycles on Mars are very similar to those on Earth. Mars experiences seasonal changes strikingly similar to Earth’s because its axis is tilted much like ours (25.19° compared to Earth’s 23.44°).

Beyond these similarities, Mars would need to undergo significant changes before humans could live there. It would be necessary to drastically thicken the atmosphere and alter its composition. At present, 96% of Mars’ atmosphere is carbon dioxide, 1.93% is argon, and 1.89% is nitrogen, and its air pressure is less than 1% of Earth’s at sea level.

Above all, Mars is devoid of a magnetosphere, which results in a significantly higher radiation level on its surface than is typical for Earth.

By bombarding the planet’s polar regions with meteors, Mars’ atmosphere could be thickened and the surface warmed. This would result in the melting of the poles, the release of the water and carbon dioxide that had been frozen there, and the ensuing greenhouse effect.

Ammonia and methane, two volatile substances, could be introduced into the atmosphere, thickening it and causing warming. Both could be mined from the icy moons of the outer Solar System, especially Titan, Ganymede, and Callisto.

After colliding with the surface, the ammonia ice would sublimate and decompose into hydrogen and nitrogen, with the nitrogen serving as a buffer gas. The hydrogen would then react with the CO2 to form water and graphite. Meanwhile, the methane would act as a greenhouse gas, accelerating global warming. The impacts would also release large amounts of dust into the atmosphere, accelerating the warming trend further.
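
Assuming the straightforward decomposition and CO2-reduction chemistry implied above, the reactions would be:

    2 NH3 → N2 + 3 H2       (ammonia breaking down into buffer nitrogen and hydrogen)
    CO2 + 2 H2 → C + 2 H2O  (the freed hydrogen reducing carbon dioxide to graphite and water)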

Another idea is to create orbital mirrors that would be positioned close to the poles and focus sunlight on the surface to start a warming cycle by causing the polar ice caps to melt and release CO2 gas. It has also been proposed to use the dark dust from Phobos and Deimos to lower the surface’s albedo, allowing it to absorb more sunlight.

In summary, there are many options for terraforming Mars, and many of them are at least on the table, if not immediately within reach.

Elon Musk and Stephen Hawking have argued that humanity needs a “backup location” to ensure its survival.

Mars One has recruited thousands of volunteers to help colonize the Red Planet. NASA, the ESA, and China have all expressed interest in the possibility of long-term habitability on Mars or the Moon. From all appearances, terraforming appears to be yet another science-fiction idea that is moving closer to becoming a reality.

Even with an unbelievable price tag and the many advanced technologies required, terraforming might one day become a reality.

Just like in the David Bowie song, do you believe there’s life on Mars?

What Technologies Are Used To Make Space Telescopes?

Outer space, the dream of every child! The sun, the stars, the moon.

These lights in the night sky.

How do we see them?

Space telescopes!

But how do they work?

Let’s find out!

What Are Space Telescopes?

Telescopes have been used by astronomers for hundreds of years to observe the night sky. The Dutch eyeglass maker Hans Lippershey invented the first telescope in 1608: a device that could magnify objects three times. It had a convex objective lens and a concave eyepiece. As the story goes, Lippershey was inspired when he saw two children in his shop holding up a pair of lenses that made a faraway weather vane appear closer.

Modern telescopes must be in space to obtain the most accurate data!

A space telescope is extremely difficult to construct and operate, and it is very expensive; it has only been feasible since the 1980s. The Hubble Space Telescope is the most noteworthy example: it has captured over 1 million images since it started observing in 1990.

Numerous additional space telescopes have been launched since 1990. Some, like Chandra, study X-rays; Fermi studies gamma rays; others see microwaves or infrared. Together they have transformed our perception of the universe. In December 2021, the James Webb Space Telescope (JWST), the largest of them all, was launched into orbit.

The JWST is the successor to the Hubble Space Telescope (HST). Because its mirror is larger than the HST’s, the JWST can see smaller details and fainter objects. It is also capable of observing light at longer wavelengths than the HST.

JWST captured its first picture of a star, along with a selfie, in February 2022. Scientists completely aligned and focused the telescope on a star in March 2022. On July 12, 2022, the first batch of data from the fully calibrated telescope was made public. It included an image of one of the faintest infrared objects ever seen, as well as evidence of water in an exoplanet’s atmosphere!

JWST was built to conduct research into the early universe.

What were the initial stars and galaxies like?

How do galaxies and stars change over time?

Are there Earth-like planets orbiting other stars?

Can we discover anything about dark energy and dark matter?

The JWST will gather data during its mission, which is expected to last between five and ten years, to help scientists answer these questions. Plus a slew of others!

The telescope weighs a total of 6,200 kg, making it comparable to an African bush elephant.

The enormous mirror on the JWST is 6.5 meters in diameter and collects light from space. It is made up of 18 hexagonal segments.

These segments unfolded and positioned themselves in space to create a single, enormous mirror.

Additionally, JWST has a sunshield the size of a tennis court. This prevents the Sun’s light (as well as light reflected off the Earth and Moon) from heating up the detectors. To function, the coldest detectors must be kept at a frigid −266 °C.

What Technologies Are Used In These Space Telescopes?

The enormous telescope orbits Lagrange Point 2 (L2), 907,530 miles from Earth. This special location allows JWST to stay aligned with the Earth as it revolves around the Sun. The telescope moves at a speed of about 451 miles per hour to maintain this orbit.

The telescope’s journey to this spot, roughly 1,500,000 kilometers from Earth, took about 29 days.

The telescope won’t move around too much because of the gravitational pull of the Sun and the Earth. It only requires tiny, infrequent rocket thrusts to keep its orbit at L2. The JWST will follow the Earth as it orbits the Sun because of this L2 location.
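
As a quick back-of-envelope check on the figures above (a sketch only; the distance and travel time are the article’s numbers, and the conversion factor is the standard miles-to-kilometers ratio):

    # Back-of-envelope check on the JWST transit figures quoted above.
    KM_PER_MILE = 1.609344

    distance_miles = 907_530                 # distance to L2 quoted above
    print(distance_miles * KM_PER_MILE)      # ~1,460,528 km, i.e. roughly 1.5 million km

    journey_km = 1_500_000
    journey_days = 29
    print(journey_km / (journey_days * 24))  # ~2,155 km/h average speed during the transit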

The James Webb Telescope’s near-infrared camera (NIRCam) is the primary imager being used.

A team from the University of Arizona built the NIRCam to detect light at infrared wavelengths between 0.6 and 5 microns.

This imager’s mission is to search for stars in the far reaches of the Kuiper Belt as well as light coming from galaxies and stars that are still in the early stages of formation.

The NIRCam is equipped with coronagraphs for imaging challenging targets. These devices block light from brighter objects, enabling imaging of fainter, dimmer objects nearby.

On a more fundamental level, the NIRCam has ten mercury-cadmium-telluride (HgCdTe) detector arrays, which capture images much as the charge-coupled devices (CCDs) in traditional digital cameras do.

The NIRCam is made up of two nearly identical, fully redundant modules that point at different fields of view in the sky. The two modules can be used concurrently, simultaneously covering the short-wavelength channel (0.6-2.3 µm) and the long-wavelength channel (2.4-5.0 µm).

Even more impressive, however, is the near-infrared spectrograph (NIRSpec). A multi-object spectrograph, NIRSpec dissects incident light into its individual color components for more in-depth analysis. The James Webb Telescope uses it to examine the chemical composition and star-formation processes of distant galaxies.

One of the technology’s most impressive achievements is the device’s microshutter assembly, a feature that enables NIRSpec to observe over 100 objects simultaneously.

The microshutter assembly is made up of four 1.5-inch squares, each containing a separate array of 62,000 microscopic shutters. The shutters are 100 x 200 micron rectangles that selectively open and close, catching only light coming from an object that NIRSpec’s detector has been trained on. From there, the light is concentrated into a single, bright point that produces the sharpest image.

The arrays themselves are made of silicon nitride wafers, and their shutter doors are lined with magnetic strips, all contained within an electrically charged metal box. Applying an electric charge and a corresponding magnetic polarization to each shutter causes it to physically open and close.

When the doors are all open, a magnet sweeps over the shutters, repelling the magnetic strips on the doors. The desired shutters are then given voltage by electronic controllers, which causes them to change their magnetic polarization and close the doors when the magnet passes by again.

In this way, the microshutter system ensures that the only doors left open are those aligned with the objects to be observed.
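
Read as an addressing scheme, the procedure is: open every door with the magnet sweep, then electrically close everything that shouldn’t stay open. A toy Python sketch of that logic (purely illustrative; the grid size and coordinates are made up and bear no relation to the real hardware layout):

    # Toy model of the microshutter addressing described above: a magnet
    # sweep opens all doors, then voltage closes every non-target shutter.
    GRID = 8  # toy grid; each real array holds about 62,000 shutters

    def select_mask(targets):
        """Leave open only the shutters aligned with target objects."""
        # Step 1: the sweeping magnet leaves every shutter door open.
        doors_open = [[True] * GRID for _ in range(GRID)]
        # Step 2: controllers address each non-target shutter so its
        # door closes when the magnet passes by again.
        for row in range(GRID):
            for col in range(GRID):
                if (row, col) not in targets:
                    doors_open[row][col] = False
        return doors_open

    mask = select_mask({(2, 3), (5, 5)})
    print(sum(map(sum, mask)))  # -> 2: one open door per target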

In its brief time in operation, the James Webb Telescope has already produced some extraordinary images, thanks largely to the cutting-edge imaging technology on board.

The imaging technology developed by the telescope team ranks among the best that humanity has ever created.

The James Webb Space Telescope’s first science images were officially released by NASA during a live event on July 12, 2022. They included the Cosmic Cliffs in the Carina Nebula, the striking Southern Ring Nebula, Stephan’s Quintet, and an analysis of the atmospheric composition of the hot gas giant exoplanet WASP-96 b.

According to a NASA press release, the breathtaking deep-field image, the deepest infrared view of the universe to date, was produced with just 12.5 hours of observing time on one of the telescope’s four instruments.

One of the mirror segments is shining brighter than the others in the “selfie” because it was the only one that had been successfully aligned and was pointing at a star at the time. One by one, the remaining mirror segments were successfully aligned.

What Can We Expect From JWST?

The four main areas of the JWST’s science mandate are:

First Light and Reionization

This refers to the period of time immediately following the Big Bang, which created the universe as we know it today.

After the Big Bang, the universe was initially a sea of particles, and light could not travel freely until the universe cooled enough for these particles to start combining. The “epoch of reionization” refers to the later period when radiation from the first stars reionized the neutral hydrogen that had formed.

The Assembly Of Galaxies

By observing how matter is arranged on enormous scales, such as in galaxies, we can learn more about how the universe came into being. One of the objectives of JWST is to look back at the earliest galaxies to better understand how the spiral and elliptical galaxies we see today evolved from various shapes over billions of years.

Additionally, researchers are attempting to understand how galaxies formed and assembled in the past, as well as how we came to have the variety of galaxies that are visible today.

Birth Of Stars And Protoplanetary Systems

One of the most famous star-forming regions is the “Pillars of Creation” in the Eagle Nebula. Stars form in gas clouds, and as they grow, their radiation pressure blows away the surrounding gas.

However, looking inside the gas is challenging. JWST’s infrared eyes will be able to observe heat sources, such as stars developing in these star-forming cocoons.

Numerous exoplanets have been found in the last ten years, many by NASA’s planet-hunting Kepler Space Telescope.

Planets And The Origins Of Life

JWST’s powerful sensors will be able to take a closer look at these planets, possibly even imaging their atmospheres in some cases.

Scientists may be better able to predict whether or not a particular planet is habitable if they have a better understanding of the atmospheres and conditions under which planets form.

The James Webb Space Telescope is another milestone in the path of exploring the universe. We do not fully know what the future holds, but we are pretty sure it’s going to be bright and exciting.

How Are Computer Chips Made In 2022?

Is it true that computer chips are made of sand?

How can something as complex as the digital world be built from one of the most widely available materials?

For a moment, imagine a world without microchips. From cars to smartphones, MRI scanners to industrial robots and data centers, they are at the core of the devices we use for work, travel, fitness, and entertainment.

But what are they made of?

Let’s find out! 

What’s A Microchip?

A microchip, also known as a chip, computer chip, integrated circuit, or IC, is a collection of electronic circuits on a tiny, flat piece of silicon.

On the chip, transistors act as tiny electrical switches that can turn a current on or off. The pattern of these microscopic switches is created on the silicon wafer as a multilayered grid of interconnected shapes.
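
To get a feel for how billions of simple on/off switches add up to computation, here is a toy Python sketch (an illustration of the principle, not of how chips are actually designed) wiring switch behavior into logic gates and a one-bit adder:

    # Transistors are on/off switches; series and parallel wiring of
    # switches yields logic gates, the building blocks of every chip.
    def series(a, b):
        return a and b  # conducts only if both switches are on -> AND

    def parallel(a, b):
        return a or b   # conducts if either switch is on -> OR

    def invert(a):
        return not a    # a switch that pulls the output low -> NOT

    def half_adder(a, b):
        """Add two bits using only the gates above."""
        sum_bit = parallel(series(a, invert(b)), series(invert(a), b))  # XOR
        carry = series(a, b)
        return sum_bit, carry

    print(half_adder(True, True))  # -> (False, True): 1 + 1 = 10 in binary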

There are microchips everywhere. More than 932 billion chips were produced globally in 2020, supporting a $440 billion market. Advances in chips have led to the creation of new products and the transformation of whole industries by offering higher performance, additional functionality, and reduced costs with each generation.

Gold Mine

The chip industry’s material of choice is silicon. In contrast to the metals usually used to conduct electrical currents, silicon is a “semiconductor,” meaning its conductive characteristics can be tuned by mixing (doping) it with other substances such as boron or phosphorus. This is what makes it possible to switch an electrical current on or off.

It’s wonderful news that silicon is present almost everywhere! Derived from sand, silicon is the second most common element on Earth after oxygen. Silica sand, composed of silicon dioxide, is the type used to make silicon wafers. The sand is melted into a huge cylinder called an “ingot,” which is later sliced into thin wafers.

Powering Your Devices

Chip advances have driven the remarkable growth in processing power and memory capacity that has let technology evolve to where it is now.

Thanks to semiconductors, computer power grew one trillion times between 1956 and 2015. Consider this: The Nintendo console’s computer was only about half as powerful as the one that guided the Apollo flights to the moon.

The Apollo computer contained 589,824 bits of read-only memory (ROM) and 32,768 bits of random access memory (RAM). A contemporary smartphone has around 100,000 times more processing power, seven million times more ROM, and a million times more RAM.
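
A quick sanity check on those figures, using nothing but the numbers quoted above (the “modern phone” values are simply what the stated multipliers imply, not measured specs):

    # Converting the Apollo computer's memory figures into familiar units.
    rom_bits = 589_824
    ram_bits = 32_768

    print(rom_bits // 8)  # 73,728 bytes, i.e. 72 KB of ROM
    print(ram_bits // 8)  # 4,096 bytes, i.e. 4 KB of RAM

    # What the multipliers quoted above imply for a modern smartphone:
    print(rom_bits * 7_000_000 // 8 // 10**9)  # ~516 GB of storage
    print(ram_bits * 1_000_000 // 8 // 10**9)  # ~4 GB of RAM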

Chips support algorithms like those used in deep learning and allow applications like virtual reality and on-device artificial intelligence (AI), as well as improvements in data throughput like 5G connections. 

All this computation generates a lot of data. The world will produce 175 zettabytes (ZB) of data by 2025, and a single zettabyte is roughly equal to one billion terabytes (TB).

The Semiconductor War

Amid the conflict between Russia and Ukraine, the United States has its sights set on Taiwan, a leader in the semiconductor industry that China has threatened to annex by force.

The Biden administration has placed limitations on the sale of a number of cutting-edge computer chips, including those made by US technology company Nvidia, to Russia and China, because they can be used for military applications.

US President Joe Biden has signed an executive order putting the USD 280 billion CHIPS and Science Act of 2022 into effect. To counter China’s expanding technical influence, it includes more than USD 52 billion in subsidies for US semiconductor producers.

China has rejected the new US chip law, which seeks to bolster American semiconductor firms. According to Wang Wenbin, a spokesman for the Chinese Foreign Ministry, the new CHIPS and Science Act will upset global supply chains and hinder commerce.

How Did Video Gaming Develop Through The Last Decade?

With The Last of Us Remastered sparking conversations everywhere, one can only think of how advanced video games and gaming consoles have developed and grown in popularity over the years.

Video games are played by almost 3 billion people globally.

Many of us have downloaded or played video games at some point in our lives, whether to conquer worlds or to collect Pokemon. According to estimates from 2021, there were 2.96 billion gamers worldwide. The number of people who play games on PCs, consoles, tablets, and smartphones is predicted to surpass 3 billion this year and reach 3.32 billion by 2024.

But how did video games develop over the last decade?

Let’s find out!

Twitch

One can attribute a major shift in how people game to the introduction of Twitch.

Launched in June 2011, Twitch lets users make their own video streams, and it quickly gained a ton of popularity. Video games like the multiplayer strategy game League of Legends, the block-building adventure Minecraft, and the battle royale shooter Fortnite have all gained popularity on Twitch as a result of live competitive streaming. 

Amazon acquired Twitch for nearly $1 billion in 2014. The platform regularly welcomes more than 15 million daily active users and has already given rise to five annual TwitchCon fan events that may one day compete with Comic-Con.

Nintendo Switch

The Switch followed the colossal failure of the Nintendo Wii U, which many believed would be the Japanese company’s downfall, as it had failed to deliver appealing console hardware for the first time in decades.

Even if you don’t like Nintendo’s platform, you can’t dispute how much of an impact it has had on the gaming industry, whether it be due to its technological advancements or its never-ending collection of titles that never cease to amaze.

Along with The Witcher 3, Breath of the Wild is considered one of the all-time greats in the open-world gaming genre.

The excitement and freedom of switching between docked and portable mode cannot be matched by other consoles; it feels like a work of technical magic.

The Nintendo Switch’s hardware supremacy has only become more evident with time; the basic ideas behind the device are timeless and will surely inspire the company’s future consoles.

Narrative Games

Narrative games employ cinematic, heavily plotted elements; think Telltale’s The Walking Dead, Naughty Dog’s The Last of Us, and The Dark Pictures Anthology series.

This adored trend took off with Heavy Rain, acclaimed by critics and audiences alike. Such games lean heavily on the player’s emotional journey and on interaction with a deep web of relationships and emotions.

While narrative games have existed for 30 years, the last decade saw them move from small titles to mainstream, extremely popular gaming experiences.

Mobile Gaming

Although mobile games have been around since the first Nokia cellphones (remember Snake?), they truly took off after the debut of the iOS App Store in 2008. The in-app purchase feature Apple introduced in 2009 set the tone for mobile games for the next ten years.

The popularity of gaming on mobile devices has skyrocketed.

  • The location-based augmented reality mobile game Pokemon Go by Niantic Inc. reached $3 billion in lifetime earnings.
  • In its first four months on the App Store and Google Play, Tencent Games’ multiplayer PUBG Mobile game surpassed 100 million downloads.
  • With 129.3 million downloads in the first 30 days of release and over $37.4 million in player spending since its September launch, Mario Kart Tour was Nintendo’s biggest mobile launch in its first month.
  • Call of Duty Mobile then overtook it, receiving 148 million downloads in its first month and becoming the second-most successful mobile launch of all time.

The past decade has seen massive tech advancements in mobile device development and groundbreaking profits from cellphone game sales.

The future looks promising, with the release of Sony’s PlayStation 5 and the sales frenzy that saw it resell for four times its retail price around Christmas, plus all the insane time people spent gaming during the pandemic. One can be sure that video games will soon break through as a worldwide sport; they are already one of the most profitable entertainment sectors.

The Complete Guide To Computer Mouse History In 3 Minutes

Have you ever wondered how the mouse you use every day works?

Let’s find out!

What Do Computer Mouses Do? 

A computer mouse is a pointing device for graphical user interfaces: it controls the cursor, letting you move, select, point, and more.

History

Douglas Engelbart invented the first computer mouse in the early 1960s. At that time, he was the head of Stanford Research Institute’s (SRI) Augmentation Research Center in Menlo Park, California.

The mouse was only a small component of a much broader research program that was initiated in 1962 with the intention of improving human intelligence.

By the time Engelbart created the mouse, he had already been looking into methods to help people become better at solving complex issues for about a decade.

After conceiving the idea, Engelbart hired Bill English, already employed in another SRI lab, to build the hardware. Jeff Rulifson later joined the team and greatly improved the software.

Doug Engelbart and Bill English envisioned computer-aided workstations to support problem-solvers, who needed a device for manipulating information displays by moving a pointer around the screen.

Light pens, joysticks, and other gadgets were in use or under evaluation at the time, but Engelbart and English were searching for the best and most effective device.

The first computer mouse prototype was created in 1964 for use with a graphical user interface (GUI), or “windows.” At the time it cost $300, equal to about $2,866 in 2022 when adjusted for inflation.

They approached NASA in 1966, and with NASA funding, the team devised several tasks and timed volunteers performing them with different devices. For instance, the computer might place a pointer in one location and an object at a random spot on the screen, and the time it took users to move the pointer to the object was measured.

It wasn’t long before it was obvious that the mouse was superior to the others. Devices like the light pen simply took too much time, continually requiring the user to pick up the pen and reach all the way to the screen.

The cord was initially located at the front of the mouse, but it was quickly moved to the back to get it out of the way. The device itself was a straightforward mechanical design with two perpendicular, bottom-mounted discs. To draw a perfectly straight horizontal or vertical line, you could rock or tilt the mouse so only one wheel turned.
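
Conceptually, each wheel turns only when the mouse moves along that wheel’s axis, so rotation maps directly onto x and y. A minimal Python sketch of the idea (illustrative only, not Engelbart’s actual hardware logic):

    # Toy model of the two-wheel mouse: each perpendicular wheel reports
    # motion along one axis, so wheel ticks map straight to x/y deltas.
    class TwoWheelMouse:
        def __init__(self):
            self.x = 0
            self.y = 0

        def roll(self, horiz_ticks, vert_ticks):
            # Tilting the mouse so one wheel drags pins that axis at
            # zero ticks, producing a perfectly straight line.
            self.x += horiz_ticks
            self.y += vert_ticks

    mouse = TwoWheelMouse()
    mouse.roll(5, 0)   # pure horizontal motion
    mouse.roll(0, -3)  # pure vertical motion
    print(mouse.x, mouse.y)  # -> 5 -3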

Bill English later replaced the wheels with a ball, giving the mouse its present-day form. The ball let the mouse move in any direction.

This made the mouse considerably easier to operate than Douglas’ original design. At the time, Bill was employed at Xerox PARC, and his enhanced design helped the company develop the Xerox Alto, the first computer designed around the mouse. It quickly achieved great success.

Mechanical mouses collect dirt over time, so users had to clean them often. Because of this, work on a better solution started right away, and the advent of optical mouses, around 1980, marked the beginning of a new age.

However, optical mouses didn’t really take off until they became commercially available to a mass audience in the late 1990s, when mechanical mouses started to lose favor.

The MX 1000, a mouse from Logitech, was the first optical tracking mouse to use a laser. Agilent’s laser tracking technology offered far more precision than the earlier LED-based optical tracking technique.

In September 2003, when it unveiled the Apple Wireless Keyboard and Apple Wireless Mouse, Apple became the first company to integrate operating system support for Bluetooth keyboard and mouse input devices. 

For many years the rest of the market lagged behind Apple, choosing proprietary RF transmitter/receiver dongles rather than Bluetooth for keyboards and mouses.

Like the history of the contemporary computer, the history of the computer mouse is one of invention and creativity, and optical mouses are not the end of the narrative. In the modern era, research is being done to develop ergonomic computer mouses that are not only simple to use but also beneficial to your health, something that earlier mouse designs could not offer.

Ancient Technology & History

Ancient examples of technology

Archeologists have discovered ancient technologies whose precision and efficiency have left scientists awestruck. A few of these inventions are so far ahead of their time that they seem impossible to recreate even with modern technology. These technologies also shed light on the socio-cultural lives of our ancestral societies.

Antikythera mechanism

This ancient technology was discovered off the Greek island of Antikythera in 1901 by divers searching for sponges. The Antikythera mechanism, a 2,000-year-old analog computer, had functional gears and resembled a clock. Researchers believe the invention may also have been used for navigation, as it could “replicate the motions of the heavens.” The device was no larger than the mantel clocks in our homes, but instead of displaying the time like a modern clock, the Antikythera mechanism displayed “celestial time,” using celestial bodies as guides.

Automatic doors

Automated doors came into the spotlight in 1931 and were hailed as symbols of the “modern age.” However, the Greeks had already built them in the first century AD.

A fire lit in a brass vessel generated enough heat to build up air pressure, which pushed the door open. The vessel simultaneously pumped water into attached containers, which then acted as weights to keep the door open. The mechanism functioned much like a hydraulic system.

These ancient doors, however, were slow, rigid in use, and far from user-friendly, and hence never became as popular as the automated doors of modern times.

Houfeng Didong Yi: The earthquake detector in ancient times 

Nearly 2,000 years ago in ancient China, Zhang Heng, an engineer, scholar, astronomer, artist, and scientist, invented the Houfeng Didong Yi, a seismoscope that was the first effective earthquake detector in history.

The Houfeng Didong Yi could detect earthquakes hundreds of kilometers away. The device was shaped like a jar, with eight tube-like projections on the exterior and eight corresponding toad-shaped figures at the base. Each toad represented a direction a seismic wave might travel from. To indicate the direction of an earthquake, the Houfeng Didong Yi would drop a ball into the toad matching the direction of the seismic wave.
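
In modern terms, the instrument quantized the direction of the incoming wave into one of eight compass sectors. A tiny Python sketch of that mapping (an analogy only, not a model of the real mechanism):

    # The Houfeng Didong Yi's eight toads, read as eight compass sectors:
    # the azimuth of the seismic wave selects which toad receives the ball.
    DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

    def toad_for(azimuth_degrees):
        """Map a wave azimuth (0-360, clockwise from north) to a sector."""
        return DIRECTIONS[round(azimuth_degrees / 45) % 8]

    print(toad_for(10))   # -> N
    print(toad_for(100))  # -> E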

Aeolipile: The ancient steam turbine

The Aeolipile was invented by the engineer and mathematician Heron of Alexandria and functioned like an ancient steam turbine. It was a hollow sphere mounted so that it could turn on a pair of tubes supplying it with steam from a cauldron. Steam escaping through nozzles on the sphere made the device spin at high speed.

Although the device worked, it never progressed beyond being a novelty; it was difficult to obtain enough fuel to keep it running for long periods of time.

Zimbabwe’s digital economy

It is no news to anyone that Zimbabwe has a money problem. From the late 90s until 2009, the Zimbabwe dollar spiraled into hyperinflation, peaking in 2008 at 11,200,000%. The episode became infamous for banknotes with huge face values, such as the 100 trillion Zimbabwe dollar note.

In order to try to solve this problem, in 2009 the Zimbabwe government abolished the Zimbabwe dollar and announced that multiple foreign currencies would take its place in the country, including the US dollar, the British pound, the euro, the South African rand, and the Botswana pula.

However, the crisis had taken its toll. Loss of faith in financial institutions led people to hoard cash instead of storing their money in bank accounts, and the economy was left badly damaged, importing more than it exported, among other problems.

As a result, it didn’t take long for cash to dry up in the country. The government tried to counteract this by issuing bond notes pegged to the US dollar, but those ran short as well. And in time, inflation started to return.

So, how could they deal with this shortage of cash? By going cashless.

Cashless?

Having no cash is different from having no money. Financial insecurity and economic instability, along with the use of the more stable US dollar, led people to hoard cash, that is, to physically store their money, whenever they had the option.

However, banks don’t keep all of the money they hold in cash, especially foreign currency. As people desperately tried to pull their money out of the banks, the reserves drained. Cash is now very hard to get, yet many people still keep balances in bank accounts, some even in foreign banks, while others store what cash they have at home and in safes.

As such, with little cash circulating, electronic payment became the dominant form of payment, via both bank cards and mobile phones.

The app EcoCash became the main payment service, used by individuals and companies alike. But not without its disadvantages: using money through it involves many fees. As a result, it is common to see different prices for different methods of payment, with US dollar cash still the favored option.

Does it work?

Zimbabwe is definitely not the best example of how a cashless society could work. The transition to mostly digital payments was quick and violent, the result of a years-long economic crisis that still isn’t over. It does show, however, that going cashless can work, and can work virtually anywhere.

Of course, Zimbabwe is not the only country going cashless. Sweden, for example, has been rapidly transitioning to fully cashless transactions over the last few years, relying mostly on the app Swish, a cooperation between the major Swedish banks and the Central Bank of Sweden. A much more orderly transition.

In Sweden, what has prevented a fully cashless society is that cash is still available. Older people, especially in rural regions, are more resistant to the change, particularly if they aren’t used to smartphones, and so still prefer cash.

Being a government-backed project also really helps the transition. Zimbabweans, by contrast, don’t have much choice of payment method: it’s EcoCash (the most requested method), bank cards, or foreign cash. And all of them have serious drawbacks: high fees, insecurity, and scarcity.

Even if similar fintechs were to appear, they would face the country’s lack of economic security, steep competition from EcoCash, and the difficulty of winning over users, given the low confidence in this sector of the economy.

Still, it had one fortunate, surprising side effect: using less cash means one less way to transmit COVID-19 during the pandemic. Not that the country isn’t struggling with the virus, given its economic crisis, but things could have been much worse otherwise.

Digital Life In Estonia’s Unique Use Of Technology: e-Banking & i-Voting

Meet Estonia’s digital life

Since the fall of the Soviet Union and Estonia’s independence, the Estonian government has been heavily investing in the digital world, not only to make its citizens’ lives easier and faster, but also to attract foreign investment and improve the country’s economy.

Although, from the outside, that may seem like nothing new, Estonia’s approach is very innovative. The country has been moving from “pen and paper” bureaucracy to streamlined, internet-based digital processes through a governmental program called e-Estonia. At the time of this writing, the official e-Estonia website states that 99% of state services are now online.

Let’s see what this is all about.

Digital bureaucracy

One of the main advantages this initiative gives citizens is the ability to handle governmental and private bureaucratic procedures much more rapidly, without leaving home. Everything can be done over the internet.

This began in 1996 with the creation of the first e-Banking system (a concept now widespread worldwide thanks to financial technology startups), followed by systems for the government decision process (e-Cabinet), tax declarations (e-Tax Board), mobile parking payment (m-Parking), digitized healthcare and medical histories (e-Health and e-Prescription), and ID cards (e-ID), among many others. Everything works around the clock, 24 hours a day.

The result is an extremely fast and transparent system that centralizes all information about you; you can check it, or allow companies to access it, in just a few clicks. All of this is powered by e-Estonia’s X-Road (a distributed data exchange system) and blockchain technology (for cybersecurity).

i-Voting

One of the most revolutionary services they provide is probably i-Voting. The i-Voting service gives Estonian citizens the ability to cast votes in elections from anywhere in the world; the only requirement is Estonian citizenship. The Estonian government states that votes have already been cast from over 110 countries, in both national parliamentary elections (Estonia has a parliamentary government) and local elections.

This doesn’t mean in-person voting has been scrapped: it is still available, and many people use it. The government estimates that only about 44% of voters use i-Voting. The voting process is anonymous in both cases.

The i-Voting system also comes with some advantages of its own. Besides being able to vote wherever you are, you are free to change your mind at any time. If you decide you no longer want to support the candidate you voted for, you can simply vote again, and the system will overwrite your previous vote, as long as you do so within the election period.
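
The revote rule is essentially last-write-wins per voter. A minimal Python sketch of those semantics (purely illustrative; the real system additionally separates the voter’s identity from the ballot to preserve anonymity):

    # Sketch of i-Voting's revote semantics: a voter's latest ballot
    # overwrites earlier ones, provided it lands within the voting window.
    class BallotBox:
        def __init__(self):
            self.votes = {}  # voter id -> latest choice

        def cast(self, voter_id, choice, within_period):
            if within_period:
                self.votes[voter_id] = choice  # overwrite any earlier vote

    box = BallotBox()
    box.cast("voter-1", "Candidate A", within_period=True)
    box.cast("voter-1", "Candidate B", within_period=True)   # changed their mind
    box.cast("voter-1", "Candidate C", within_period=False)  # too late, ignored
    print(box.votes["voter-1"])  # -> Candidate B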

Although the system still isn’t used to institute some form of direct democracy, it already showcases the power of technology to aid governments and to make everything quicker and easier.

e-Residency

However, the most revolutionary service offered by e-Estonia is definitely e-Residency. If you don’t have citizenship in a European country and want your business to enter the European market, this is the solution for you.

The e-Residency service is a government-backed program that allows anyone from outside Estonia to become a “digital citizen” of Estonia, that is, an e-Resident. Although this isn’t full citizenship (it doesn’t come with political rights and such), it gives you the right to start and run a company in Estonia, and with it access to the entire European market.

Even better, e-Residency is fully integrated with the other e-Estonia services and follows the same philosophy: you can do everything online. You can run your entire business, open bank accounts, pay taxes, and more without ever setting foot in Europe, while enjoying an entire ecosystem of people and companies that has developed around this niche.

Although Estonia is the first country to take this huge step toward being 100% digital, we can expect that, following its success, other countries will follow in its footsteps in the near future, making our lives much quicker, more efficient, and more interconnected.