Maker Pro

British Startup Blueshift Memory Supercharges Data Operations with New Memory Technology

August 16, 2019 by Tyler Charboneau

Though steady CPU enhancements have boosted computational data manipulation, embedded memory is now the largest remaining efficiency bottleneck. Blueshift Memory is actively working to shrink this gap—a development that will significantly benefit genetic research, AI programming, climate modelling, and more.

As the company explains, “A processor is ... worthless if it’s not being fed information fast enough”. That performance disparity, commonly dubbed ‘data tailback’, occurs when a CPU’s performance potential is hampered by RAM limitations. It is a growing engineering concern and a business problem that demands a solution.

Blueshift’s small team is tackling these issues, in part by designing complementary technologies for existing memory systems.


Blueshift Hello Tomorrow product demonstration

Peter Marosan, Blueshift's chief technology officer, leads a product demonstration at Hello Tomorrow 2019. Image courtesy of Blueshift.


The Technology Behind It All

Today’s RAM comes in many varieties, and Blueshift’s memory module is compatible with SRAM, DRAM, MRAM, and non-volatile cells. Their chip can markedly speed up the memory operations built on those cells.

Accordingly, Blueshift states that their module is “independent from the applied memory cell technology”. This companion architecture will unlock new possibilities in existing systems, while acting as a foundational component for new computing technology.

The Independent quantifies these gains, claiming that data operations can occur up to 1,000 times faster thanks to Blueshift’s module. The technology shines alongside search engines, where large databases are rapidly combed in response to queries.
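The kind of query workload described here can be sketched in plain Python. This is a hypothetical illustration of why data placement matters, not Blueshift’s actual mechanism: when a system can jump straight to the data it needs instead of combing the whole store, the worst-case query goes from touching every record to touching one.

```python
# Hypothetical illustration (not Blueshift's design): combing a store
# versus knowing in advance where each item lives.
records = [f"doc-{i}" for i in range(100_000)]

def linear_search(target):
    # O(n): scan the entire store for every query
    for i, doc in enumerate(records):
        if doc == target:
            return i
    return -1

index = {doc: i for i, doc in enumerate(records)}  # built once up front

def indexed_search(target):
    # O(1) on average: the location is known before the query arrives
    return index.get(target, -1)
```

Here the dictionary stands in for hardware that already knows where data resides; the gap between the two functions grows linearly with the size of the store.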

Pairing powerful CPUs with capable memory technology can dramatically improve this kind of data fetching. That’s promising news for both development teams and end users: computers may complete intensive operations in a fraction of the time.

Data stores are continually growing, and associated fragments occupy dynamic positions within RAM. Blueshift’s module can assign crucial data to static positions. From here, access becomes faster and more predictable.
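The idea of giving crucial data fixed positions can be illustrated with a toy sketch. The names and structure below are hypothetical, chosen only to show the principle: items whose slots are assigned once never need to be searched for again, so access time becomes constant and predictable.

```python
# Hypothetical sketch of fixed ("static") data placement: hot items get
# permanent slots, so their location never has to be looked up dynamically.
HOT_KEYS = ["user_count", "cache_version", "last_sync"]  # known in advance
SLOT = {key: i for i, key in enumerate(HOT_KEYS)}        # assigned once
slots = [None] * len(HOT_KEYS)

def write_hot(key, value):
    slots[SLOT[key]] = value   # the address is constant for the key's lifetime

def read_hot(key):
    return slots[SLOT[key]]    # no search, no pointer chasing

write_hot("user_count", 42)
```

In real RAM the analogous win is avoiding repeated address translation and cache misses for frequently used data, which is what makes access both faster and more predictable.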

Dynamic, big data applications no longer have to be clunky behemoths. Swifter memory-handling enables real-time processing by hastening data calculations. Developers of deep-learning applications will enjoy harnessing this performance. Fine-tuning hardware compatibility will be an exciting engineering challenge, as these chips will have unique resource demands (RAM is already quite power-hungry on its own).


Blueshift memory module concept graphic. Image courtesy of Blueshift.


Current and Future Applications

Blueshift is targeting demanding fields with its emerging memory technology. These focus areas rely on systems vastly more capable than consumer devices and even some enterprise networks, and the company’s memory module lends itself well to such demanding workloads.


AI and Modelling Galore

We’ve touched briefly on the merits of Blueshift’s memory in the AI space: better memory performance translates into markedly faster processing, as rapid data access fuels deep learning. Blueshift’s technology should streamline data retrieval and decoding, and deep learning applications thrive on such concurrent calculation and retrieval. Blueshift believes its new architecture will be increasingly integral in this area.

DNA and climate research both rely on complex modelling. Computers generate these models by analysing a plethora of stored data and translating it into actionable visuals. These visuals evolve as databases grow, and memory plays a key role in accessing pertinent information. That unfettered handoff to the CPU makes intensive work more feasible. Adaptive visualisations of future events—like genetic mutations and long-term climate change—rely on performant memory technology.

Few real-world testing grounds are better than today’s major cities, which house millions of people. These residents are mobile, navigating crowded streets every minute of every day. Metropolitan areas often suffer heavy congestion due to subpar urban planning, so their infrastructure and layouts are ripe for optimisation.

Test scenarios draw on accrued population and travel data. These simulations are understandably complex, but Blueshift’s modules can digest numerous moving parts simultaneously. Traffic-flow simulations allow officials to observe the impact of even minute changes.
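To make the “minute changes” idea concrete, here is a toy ring-road traffic simulation (a simplified cellular-automaton model in the spirit of Nagel–Schreckenberg). It is entirely hypothetical and unrelated to Blueshift’s software; it only shows the kind of parameter sweep (here, a speed limit) a planner might run, and why such workloads are memory-hungry at city scale.

```python
import random

# Toy ring-road traffic model: cars accelerate toward a speed limit,
# keep a safe gap, and occasionally brake at random. Returns average speed.
def simulate(road_len=100, n_cars=30, v_max=5, steps=200, seed=0):
    rng = random.Random(seed)
    pos = sorted(rng.sample(range(road_len), n_cars))  # distinct start cells
    vel = [0] * n_cars
    moved = 0
    for _ in range(steps):
        for i in range(n_cars):
            # free cells to the car ahead (modulo wrap-around on the ring)
            gap = (pos[(i + 1) % n_cars] - pos[i] - 1) % road_len
            vel[i] = min(vel[i] + 1, v_max, gap)   # accelerate, stay safe
            if vel[i] > 0 and rng.random() < 0.2:  # random braking
                vel[i] -= 1
        for i in range(n_cars):
            pos[i] = (pos[i] + vel[i]) % road_len
            moved += vel[i]
    return moved / (steps * n_cars)
```

Comparing `simulate(v_max=5)` against `simulate(v_max=3)` shows the effect of one policy tweak; a city-scale version of this sweep multiplies the state space enormously, which is where fast, predictable memory access pays off.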


Virtual reality (VR) concept: a silhouette of a man in a VR-based urban simulation. Image courtesy of Shutterstock.


The Future of AR and VR

Blueshift acknowledges that latency is a driving force behind proper immersion. The user experience is extremely fickle with AR and VR devices, as even minute interruptions can be jarring. Accordingly, the company has optimised its module for dynamic scenarios, supporting peak application performance even at high resolutions.

Blueshift says its architecture supports blazing-fast image handling, helping video render flawlessly in mere milliseconds.
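The “mere milliseconds” are easy to quantify: a VR headset must finish each frame within the refresh interval of its display. A quick calculation (standard refresh rates, nothing specific to Blueshift) shows how tight those budgets are.

```python
# Per-frame time budget at common VR refresh rates. Every memory fetch,
# decode, and draw call must fit inside this window, or the frame drops.
def frame_budget_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (72, 90, 120):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
```

At 90 Hz the whole pipeline has roughly 11 ms per frame, which is why shaving even fractions of a millisecond off memory access matters for immersion.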


Networking and Data Consumption

Engineering teams from internet service providers and networking companies may soon benefit immensely from new memory technology. We’ve often repeated the importance of data handling, and it truly is the common denominator across applications.

Who consumes more data than internet users on a daily basis? This data is used in a staggering number of ways under a variety of conditions. Now more than ever, customers and enterprise teams require lightning-fast access to information. Consequently, networking equipment represents a promising new testbed for Blueshift’s modules.


Where Does Blueshift Go from Here?

Blueshift has been seeking additional funding to support the development of its chip. This has been an expensive endeavour, and more help is needed despite support from partners like the University of Cambridge.

Additionally, manufacturing the chip at scale is expected to be expensive, should Blueshift move past the prototyping stage. These hurdles are notable, especially if production yields are underwhelming. Much of Blueshift’s future success will rely upon the availability of raw materials, which is unpredictable in the electronics space.

If the company’s claims are true, many fields will benefit from heightened memory performance. This bottleneck is significant, but researchers should be able to mitigate it as development progresses. Ultimately, there’s plenty to gain, and Blueshift’s memory technology may certainly facilitate our next technological leap.
