Data Storage

Cloud Computing is Out of This World: Microsoft and SpaceX Launch Azure Space


Microsoft has teamed up with Elon Musk’s SpaceX to take the next giant leap for mankind, this time in cloud computing, in space. The tech giant is collaborating with SpaceX to launch Azure Space, extending the Azure cloud platform with mobile data centers that can be deployed anywhere in the world.

 

To make this endeavor a success, Microsoft called upon SpaceX’s Starlink internet satellites and extended its agreement with satellite company SES for Azure Space. Microsoft assembled a team of respected space industry veterans to work alongside its own product engineers to build cloud capabilities that meet the demands of space. By partnering with leaders in the space community, Microsoft seeks to expand the utility of Azure with worldwide satellite connectivity, bring cloud computing to more settings, and ultimately enable its customers to achieve more.

 

Learn more by reading our blog post, The Secret Behind Radiation Hardened IT Equipment in Space.

Satellite networking available anywhere, anytime

With the rapid growth of data, the importance of dependable ways to connect has grown with it. Microsoft already operates a global network of more than 160,000 miles of subsea, terrestrial, and metro optical fiber, helping billions of people connect around the world. However, many users work in remote, rugged environments where fast access to data is nearly impossible to maintain. That is the gap the Azure Space ecosystem will fill.

Microsoft’s plan with Azure Space is to bring comprehensive satellite connectivity solutions to customers through a multi-orbit, multi-band, multi-vendor, cloud-enabled capability. Robust satellite communications, combined with Azure’s high-powered computing, machine learning, and data analytics, create substantial growth opportunities for organizations. Microsoft’s partnerships with satellite communication experts allow it to bring these resources to customers faster.

Self-contained datacenters are the future of infrastructure

 

Many businesses are already reaping the benefits of cloud computing with products like Azure Stack, and Microsoft has made huge strides in operating datacenters in extreme environments. Its Project Natick underwater datacenter, for example, sat at the bottom of the ocean and proved extremely successful in lowering energy usage. By developing insights and gathering feedback from the users with the harshest requirements, Microsoft built the Azure Modular Datacenter (MDC) to take that computing to the next level.

To learn more about the Project Natick underwater datacenter, check out our blog post: The Underwater Data Center of the Future

Microsoft created Azure Space to make fast, secure cloud computing available in any environment. Customers will have the luxury of Azure Space on their terms, wherever they need it, in a self-contained unit, even in space. The satellite data center gives organizations access to a complete datacenter in remote locations, or a transportable solution that enhances existing infrastructure. No natural disaster on Earth can harm your data center when it’s floating hundreds of miles above the ground.

Primed for space with the power of Azure

 

With SpaceX’s advancements in modern rocketry, human missions and satellite launches have become more frequent. These advances are allowing tech companies like Microsoft to develop more reliable, repeatable digital technologies that help the space community launch faster and with more confidence in the success of their missions.

 

At the top of these recent innovations between tech and space sits the Azure Orbital Emulator. Both commercial and government-funded space exploration groups are building constellations of thousands of interconnected satellites, which require detailed design and AI-driven procedures to guarantee optimal networking connectivity and coverage while in orbit.

 

Azure Orbital Emulator is an emulation environment that conducts massive satellite simulations in both software and hardware. It lets developers evaluate and train complex AI algorithms and satellite networking systems before ever launching a satellite into orbit. Azure can mimic an entire satellite network, including intricate, real-time scene generation. The Azure Orbital Emulator is already being used by customers in Microsoft’s Azure Government environment.
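
Microsoft has not published the emulator’s internals, but the kind of analysis it enables is easy to illustrate. The toy Python sketch below is not the Orbital Emulator itself: it estimates how often at least one satellite in a single, evenly spaced low-Earth-orbit ring is visible to a ground station, using an assumed 550 km altitude and a 25-degree minimum elevation angle.

    import math

    RE = 6371.0                      # Earth radius, km
    ALTITUDE = 550.0                 # assumed LEO shell altitude, km
    MIN_ELEV = math.radians(25.0)    # assumed minimum usable elevation angle

    # Largest central angle between satellite and station at which the
    # satellite still sits above the minimum elevation (standard geometry).
    MAX_SEP = math.acos(RE * math.cos(MIN_ELEV) / (RE + ALTITUDE)) - MIN_ELEV

    def coverage_fraction(num_sats: int, steps: int = 10_000) -> float:
        """Fraction of an orbital period during which at least one of
        num_sats evenly spaced satellites in one ring covers a ground
        station fixed at angle 0 in the ring's orbital plane."""
        covered = 0
        for i in range(steps):
            phase = 2 * math.pi * i / steps   # ring advances over one period
            for k in range(num_sats):
                angle = phase + 2 * math.pi * k / num_sats
                sep = abs((angle + math.pi) % (2 * math.pi) - math.pi)
                if sep <= MAX_SEP:
                    covered += 1
                    break
        return covered / steps

    for n in (6, 12, 18, 24):
        print(f"{n:2d} satellites -> {coverage_fraction(n):5.0%} coverage")

Even this crude model shows why constellation design demands simulation: coverage grows with satellite count in ways that only fall out of the geometry, and a real emulator layers networking, hand-offs, and AI processing on top.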

The future of space and tech together

 

The partnership between space and tech helps tackle some of the toughest technology challenges faced in the cosmos. Users currently struggle with the massive amounts of data produced by satellites, with bringing cloud services and bandwidth to remote locations, and with designing multifaceted space systems. This network of partners brings data to the ground faster, making connections that weren’t possible before. The future of space and tech working together is exciting: using the power of cloud and space technology to help businesses re-imagine solutions to some of their most challenging problems.

Data Backup and Archived Data? What’s the Difference…


What’s the difference between a data backup and archived data?

Those new to the world of data storage may not see a clear distinction between the storage methods in use today. The two main methods of retaining data are the data backup and the data archive, and there is a huge difference between the two.

Those who are oblivious to the difference may be wasting costly storage space and money, not to mention putting their data at risk.

What is a data backup?

A data backup is a copy of the original data, used to replace the original in the event it is lost, stolen, or destroyed. Think of a backup as a safety net.

Disk storage and LTO tape are both great examples of backup technology. Each is used as a backup platform to store a copy of a dataset that can replace unusable or unavailable originals.

Is LTO technology right for your long-term data storage strategy?

What is archived data?

Archived data, by contrast, consists of a collection of historical data that is seldom retrieved. A data archive usually contains important data intended to be kept long-term for various reasons, including reference and analytical applications. Archiving also frees up primary disk storage by relocating data that is no longer actively used but must be retained.

An organization’s data backup is a copy of the original data, while its data archive is the original that was removed from its original location and stored at another site for long-term custody.

Some organizations try to configure their backup software to satisfy both the backup and archive roles. The downside of this approach is that locating a single file needed for long-term retention within an entire backup server can be very time consuming.

Keeping complete backup jobs around is not a cost-effective use of data storage either, because backups only copy the data, leaving the original file in place. Realistically, it doesn’t free up any space at all.

Backing up data is simply another way of making a copy of existing data, which consumes more storage. When the original data is instead moved to a data archive, primary storage space is freed and data management becomes much more efficient.
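
To make the distinction concrete, here is a minimal Python sketch, using hypothetical paths: a backup copies the file and leaves the original in place, while an archive moves it, which is exactly why only archiving frees primary storage.

    import shutil
    from pathlib import Path

    def back_up(source: Path, backup_dir: Path) -> Path:
        """A backup COPIES the data: the original stays put, so the
        copied set now occupies space on both tiers."""
        backup_dir.mkdir(parents=True, exist_ok=True)
        return Path(shutil.copy2(source, backup_dir / source.name))

    def archive(source: Path, archive_dir: Path) -> Path:
        """An archive MOVES the original to long-term storage,
        freeing the space it occupied on the primary tier."""
        archive_dir.mkdir(parents=True, exist_ok=True)
        return Path(shutil.move(str(source), str(archive_dir / source.name)))

    # Hypothetical usage:
    # back_up(Path("reports/q3.csv"), Path("/mnt/backup"))    # q3.csv remains
    # archive(Path("reports/q3.csv"), Path("/mnt/archive"))   # q3.csv leaves primary disk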

A key differentiator between data archiving software and data backup software is the labeling and search capability of an accessible archive: metadata about each archived object is stored in a database and can be searched against user-supplied criteria.
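
As a rough illustration of that catalog-driven search, the sketch below stores a few archive records in an in-memory SQLite table and queries them by user-supplied criteria. The schema and field names are invented for the example; real archiving products track far richer metadata.

    import sqlite3

    # In-memory catalog for illustration; a real archive persists this database.
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE archive_objects (
        object_id INTEGER PRIMARY KEY,
        file_name TEXT, owner TEXT, project TEXT,
        archived_on TEXT, tape_barcode TEXT)""")
    db.executemany(
        "INSERT INTO archive_objects VALUES (?, ?, ?, ?, ?, ?)",
        [(1, "q3_report.csv", "finance", "audit-2023", "2023-10-02", "LTO901"),
         (2, "telemetry.dat", "ops", "launch-42", "2023-11-15", "LTO902")])

    # The user's criteria drive the lookup -- no trawling through backup images.
    rows = db.execute(
        "SELECT file_name, tape_barcode FROM archive_objects "
        "WHERE owner = ? AND archived_on >= ?",
        ("finance", "2023-01-01")).fetchall()
    print(rows)   # -> [('q3_report.csv', 'LTO901')]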

Enhancing both data backups and archived data with LTO technology

 

Backups and archives stored on LTO tape provide some of the industry’s most cost-effective, high-performance, high-capacity data storage. Information stored on tape is largely offline, adding a layer of protection against ransomware and cyber-attacks. This layer of protection is known as an “airgap.” As part of an air-gapped storage strategy, tapes can easily be copied and shipped offsite, providing a further layer of protection against unforeseen disasters and cyber-attacks.

Learn more about airgap here.

Additional benefits of using LTO tape for archiving data are:

  • Archives on LTO tape can be stored for 30 years or more.
  • LTO tape supports the Linear Tape File System (LTFS) for native, fast file access (see the sketch below).
  • LTO technology provides encryption and WORM support, enhancing an already highly secure storage solution.
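
Because LTFS presents the tape as an ordinary mountable filesystem, writing an archive to tape can look like plain file I/O. Here is a minimal Python sketch, assuming a cartridge already formatted and mounted at the hypothetical path /mnt/ltfs; it copies a file and verifies it by checksum before the tape is ejected for offsite, air-gapped storage.

    import hashlib
    import shutil
    from pathlib import Path

    LTFS_MOUNT = Path("/mnt/ltfs")   # assumed mount point of an LTFS-formatted tape

    def sha256(path: Path) -> str:
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def archive_to_tape(source: Path) -> None:
        """Copy a file onto the LTFS mount like any other filesystem,
        then verify the copy before the cartridge leaves the building."""
        dest = LTFS_MOUNT / source.name
        shutil.copy2(source, dest)
        if sha256(source) != sha256(dest):
            raise IOError(f"checksum mismatch writing {source.name} to tape")
        print(f"{source.name} verified on tape")

    # archive_to_tape(Path("projects/survey_2024.tar"))   # hypothetical file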

The Secret Behind Radiation Hardened IT Equipment in Space



From 1961 to 1975, during the worldwide space race, the United States was making history with successful moon landings and technology was booming. Yet the Apollo 11 computer had a processor that ran at 0.043 MHz, meaning the iPhone in your pocket has over 100,000 times the processing power of the computer that landed man on the moon! More than 50 years later, it’s no secret that technology has developed into something we’d never dreamed possible. So you’d think we’d at least be using updated systems in space today, right? Wrong. The computer hardware on board spacecraft is far from the newest and best around.

Until the recent SpaceX Falcon 9 flight, space travel was conducted with outdated processors. Even the International Space Station (ISS) operates with two sets of three command-and-control multiplexer/demultiplexer computers dating from 1988; the chips that made up the original Sony PlayStation in 1994 are faster! Luckily for all future astronauts and space cowboys alike, the SpaceX Falcon 9 carrying a Dragon spacecraft to the ISS delivered the first commercial off-the-shelf (COTS) high-performance computer to orbit the Earth. It also happens to be among the first supercomputers in space.

What is Radiation Hardening and Why is it Necessary?

Radiation hardened electronics can simply be defined as electronic components that have been designed and tested to provide some level of protection against penetrating radiation. If not protected, radiation can cause computer components to malfunction, damage their circuitry, or shut the device down completely. Radiation hardening is essential when electronics are used in environments where they will be exposed to high-energy ionizing or space radiation.

There are three types of space radiation that concern electronic components used in space: galactic cosmic rays (GCRs), high-energy solar radiation, and radiation belts. Galactic cosmic rays are electrons, protons, or neutrons that originate outside our solar system. High-energy solar radiation consists of emissions from the sun caused by solar flares or explosions of stored magnetic energy. Radiation belts contain trapped electrons and ions of varying energy levels. GCRs and solar radiation routinely reach the earth and are therefore present at all of the earth’s atmospheric levels.

For manned spacecraft and satellites, continuous and reliable operation depends on being able to withstand space radiation. So why do we use spacecraft with such outdated processors? By NASA’s standards and the laws of physics, not just any computer can go into space. Computer components, especially the CPUs, must be radiation hardened; otherwise they tend to fail due to the effects of ionizing radiation.

There is more modern hardware in space, such as the laptops used on the ISS, but those are not high-performance computers. They are ordinary laptops that are expected to fail; in fact, there are more than a hundred laptops on the ISS, and most are obsolete. Serious data mining demands high-performance computing. After all, that is why we run experiments on the space station.

The typical way to radiation-harden a computer for use in space is to add redundancy to its circuits or to use insulating substrates instead of the usual semiconductor wafers. That is not only very costly but laborious as well. Scientists believe that simply slowing a system down in adverse conditions can avoid glitches and keep the computer running.
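
One classic form of that circuit redundancy is triple modular redundancy (TMR): three copies of a circuit compute the same result and a voter takes the majority, so a single radiation-induced upset is simply outvoted. The Python sketch below is a software analogue of the idea, with the fault injected artificially.

    import random
    from collections import Counter

    def tmr_vote(compute, x):
        """Run the computation three times and majority-vote the results,
        mimicking three redundant circuits feeding a hardware voter."""
        results = [compute(x) for _ in range(3)]
        value, count = Counter(results).most_common(1)[0]
        if count < 2:
            raise RuntimeError("no majority: more than one copy disagreed")
        return value

    def flaky_square(x):
        """Squares x, occasionally flipping a bit to simulate a
        single-event upset striking one redundant copy."""
        result = x * x
        return result ^ 1 if random.random() < 0.05 else result

    # A lone upset is masked by the other two copies; only the rare
    # simultaneous fault in two copies could defeat the vote.
    print(tmr_vote(flaky_square, 12))   # -> 144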

The end goal is a functional supercomputer that can operate in space without years spent hardening it. Using off-the-shelf servers and custom-built software, scientists are attempting to harden a computer in software by throttling its speed whenever there is a solar flare or another radiation hazard. If it works, astronauts will have the latest devices available, increasing their onboard capabilities.
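
That throttling approach can be sketched in a few lines. Everything here is hypothetical: the hazard reading is a stub where a real system would poll a dosimeter or a space-weather alert feed, and the threshold is invented.

    import time

    HAZARD_THRESHOLD = 0.7   # assumed; a real mission would tune this

    def radiation_level() -> float:
        """Stub for a dosimeter or space-weather reading, normalized to [0, 1]."""
        return 0.2

    def process(batch):
        return sum(batch)    # stand-in for the real computation

    def hardened_run(batches):
        results = []
        for batch in batches:
            while radiation_level() > HAZARD_THRESHOLD:
                # Throttle: idle through the flare window rather than
                # computing results a transient upset could corrupt.
                time.sleep(1.0)
            results.append(process(batch))
        return results

    print(hardened_run([[1, 2, 3], [4, 5, 6]]))   # -> [6, 15]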

The Effects of Space Radiation

There are a number of ways component designers can radiation-harden their devices. One of the most common is hardening for total ionizing dose, the amount of radiation a device is expected to withstand over its entire life before problems occur. A typical requirement is 100 kilorads of total-dose radiation hardness. Today’s advanced electronic components are changing the total-dose picture: the shrinking size of circuits on modern chips is decreasing their exposure to total-dose radiation.
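
That 100-kilorad figure turns into a lifetime estimate with simple arithmetic: divide the rated dose by the rate at which dose accumulates. The rates below are illustrative guesses only; real accumulation depends heavily on the orbit and on shielding depth.

    RATED_DOSE_KRAD = 100.0   # total-dose hardness cited above

    # Hypothetical accumulation rates (krad/year) behind modest shielding.
    DOSE_RATE_KRAD_PER_YEAR = {
        "low Earth orbit": 0.5,
        "geostationary orbit": 5.0,
        "inside the radiation belts": 20.0,
    }

    for orbit, rate in DOSE_RATE_KRAD_PER_YEAR.items():
        years = RATED_DOSE_KRAD / rate
        print(f"{orbit:28s} -> ~{years:5.1f} years to exhaust the dose budget")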

This trend is a double-edged sword, because the steady shrinking of chip geometries also makes these devices more vulnerable to other kinds of radiation effects, namely single-event upset (SEU) and single-event latchup (SEL). Unlike gradual total-dose degradation, these are faults triggered by a single energetic particle: an SEU flips a stored bit, while an SEL drives a circuit into a high-current state that can destroy it if power is not cycled.