
Xerox Awarded Patent for DLT Based Electronic Document Verification System – Aisshwarya Tiwari


According to a patent filing published November 13, 2018, by the U.S. Patent and Trademark Office (USPTO), American print and digital document solutions company Xerox has won a patent for a DLT-based auditing system that tracks revisions to electronic documents.

Increased Authenticity of Electronic Documents

Per the filing, the patent was originally filed on February 16, 2016. It describes a blockchain-based system for efficiently and securely recording changes made to electronic documents…

Read more: https://btcmanager.com/xerox-awarded-patent-dlt-based-electronic-document-verification-system/
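
The excerpt above does not explain how the patented system works internally, so the sketch below is only a rough illustration of the general idea behind a hash-linked revision log: every recorded change commits to the hash of the previous entry, so any tampering with earlier history becomes detectable. All names here (RevisionLedger, record_revision) are hypothetical, and a real DLT deployment would replicate such entries across many independent parties rather than keep them in a single process.

    import hashlib
    import json
    import time

    def _hash(payload: dict) -> str:
        """Deterministic SHA-256 over a JSON-serialized payload."""
        return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

    class RevisionLedger:
        """Toy append-only ledger: each revision commits to the previous one by hash."""

        def __init__(self):
            self.entries = []

        def record_revision(self, document_id: str, content: str, editor: str) -> dict:
            prev_hash = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
            entry = {
                "document_id": document_id,
                "content_hash": hashlib.sha256(content.encode()).hexdigest(),
                "editor": editor,
                "timestamp": time.time(),
                "prev_hash": prev_hash,
            }
            entry["entry_hash"] = _hash(entry)
            self.entries.append(entry)
            return entry

        def verify(self) -> bool:
            """Recompute every hash; altering any earlier entry breaks the chain."""
            prev_hash = "0" * 64
            for entry in self.entries:
                body = {k: v for k, v in entry.items() if k != "entry_hash"}
                if body["prev_hash"] != prev_hash or _hash(body) != entry["entry_hash"]:
                    return False
                prev_hash = entry["entry_hash"]
            return True

    ledger = RevisionLedger()
    ledger.record_revision("doc-001", "First draft", editor="alice")
    ledger.record_revision("doc-001", "Second draft, with edits", editor="bob")
    print(ledger.verify())  # True; changing any stored field flips this to False

In an actual distributed-ledger setting, every participant would run the same chained-hash check independently, which is what makes the resulting audit trail hard to falsify.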

 

 

 

Your kind donations would go a long way toward funding our future research and endeavors – thank you.


Why Apple Is Finally Ditching Its Proprietary Lightning Connector For USB-C On All iPhones, iPads – Jean Baptiste Su


At the company’s “More in the Making” event on Tuesday, Apple’s vice-president of hardware engineering John Ternus revealed that the new iPad Pro will have a USB-C port – already present on the latest MacBooks – instead of the company’s proprietary Lightning connector. “Because a high performance computer deserves a high performance connector. And so in these new iPad Pros, we’re moving to USB-C,” said Ternus. “This brings a whole new set of capabilities to the iPad Pro, like connecting to accessories that change how you use your iPad, cameras, musical instruments, or even docks. Or connecting to high-resolution external displays up to 5K…”

Read more: https://www.forbes.com/sites/jeanbaptiste/2018/10/31/why-apple-is-finally-ditching-its-proprietary-lightning-connector-for-usb-c-on-all-iphones-ipads/#409f327a434c

 

 

 

 

 


AI And The Third Wave Of Silicon Processors


The semiconductor industry is currently caught in the middle of what I call the third great wave of silicon development for processing data. This time, the surge in investment is driven by the rising hype and promising future of artificial intelligence, which relies on machine learning techniques referred to as deep learning.

As a veteran with over 30 years in the chip business, I have seen this kind of cycle play out twice before, but the amount of money being plowed into the deep learning space today is far beyond the amount invested during the other two cycles combined.

The first great wave of silicon processors began with the invention of the microprocessor itself in the early 70s. There are several claimants to the title of the first microprocessor, but by the early 1980s, it was clear that microprocessors were going to be a big business, and almost every major semiconductor company (Intel, TI, Motorola, IBM, National Semiconductor) had jumped into the race, along with a number of hot startups.

These startups (Zilog, MIPS, Sun Microsystems with its SPARC architecture, and Inmos with its Transputer) took the new invention in new directions. And while Intel clearly dominated the market with its PC-driven volumes, many players continued to invest heavily well into the 90s.

As the microprocessor wars settled into an Intel-dominated détente (with periodic flare-ups from companies such as IBM, AMD, Motorola, HP and DEC), a new focus for the energy of many of the experienced processor designers looking for a new challenge emerged: 3-D graphics.

The highly visible success of Silicon Graphics, Inc. showed that there was a market for beautifully rendered images on computers. The PC standard evolved to allow the addition of graphics accelerator cards by the early 90s, and when SGI released the OpenGL standard in 1992, it opened a market for independently designed graphics processing units (GPUs).

Startups such as Nvidia, Rendition, Raycer Graphics, ArtX and 3dfx took their shots at the business. At the end of the decade, ATI bought ArtX, and the survivors of this second wave of silicon processor development were set. While RISC-based architectures like ARM, MIPS, PowerPC and SPARC persisted (and in ARM’s case, flourished), the action in microprocessors never got back to that of the late 80s and early 90s.


Competition between Nvidia and ATI (eventually acquired by AMD) drove rapid advances in GPUs, but the barrier to entry for competitors was high enough to scare off most new entrants.

In 2006, Geoffrey Hinton published a paper that described how a long-known technology referred to as neural networks could be improved by adding more layers to the networks.

This discovery changed machine learning into deep learning. In 2009, Andrew Ng, a researcher at Stanford University, published a paper showing how the computing power of GPUs could be used to dramatically accelerate the mathematical calculations required by convolutional neural networks (CNNs).
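
To make concrete why GPUs fit this workload so well, here is a minimal NumPy sketch (the layer sizes, weights, and batch size are arbitrary placeholders, not values from the papers mentioned above): every additional layer in a deeper network adds roughly one more large matrix multiplication to the forward pass, and it is exactly this kind of bulk, regular linear algebra that GPUs accelerate.

    import numpy as np

    rng = np.random.default_rng(0)

    def relu(x):
        return np.maximum(x, 0.0)

    # A "deeper" network is just more stacked weight matrices: each extra layer
    # adds another large matrix multiplication to the forward pass.
    layer_sizes = [784, 512, 512, 256, 10]          # arbitrary example dimensions
    weights = [rng.standard_normal((m, n)) * 0.01   # small random weights
               for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

    def forward(batch):
        """Forward pass: the cost is dominated by the dense batch @ W products."""
        activations = batch
        for W in weights[:-1]:
            activations = relu(activations @ W)
        return activations @ weights[-1]            # raw scores from the final layer

    batch = rng.standard_normal((64, 784))          # a batch of 64 example inputs
    print(forward(batch).shape)                     # (64, 10)

On a GPU, each of those matrix products is spread across thousands of parallel arithmetic units, which is where the large speedups reported in that era came from.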

These discoveries — along with work by people like Yann LeCun and Yoshua Bengio, among many others — put in place the elements required to accelerate the development of deep learning systems: large labeled datasets, high-performance computing, new deep learning algorithms and the infrastructure of the internet to enable large-scale work and sharing of results around the world.

The final ingredient required to launch a thousand (or at least several hundred) businesses was money, which soon started to flow in abundance with venture capital funding for AI companies almost doubling every year from 2012. In parallel, large companies — established semiconductor heavyweights like Intel and Qualcomm and computing companies like Google, Microsoft, Amazon and Baidu — started to invest heavily, both internally and through acquisition.

Over the past couple of years, we have seen the rapid buildup of the third wave of silicon processor development, which has primarily targeted deep learning. A significant difference between this wave of silicon processor development and the first two waves is that the new AI or deep learning processors rarely communicate directly with user software or human interfaces — instead, these processors operate on data.

Given this relative isolation, AI processors are uniquely able to explore radically different and new implementation alternatives that are more difficult to leverage for processors constrained by software or GUI compatibility. There are AI processors being built in almost every imaginable way, from building on traditional digital circuits to relying on analog circuits (Mythic, Syntiant) to derivatives of existing digital signal processing designs (Cadence, CEVA) and special-purpose circuits optimized for deep learning computations (Intel Nervana, Google TPU, Graphcore).

And one popular chip architecture revives a technology dating back to the 30-year-old Inmos Transputer: systolic processing (Wave Computing, TPU), proving that everything does indeed come back into fashion one day. Think of systolic processing as the bell bottoms of the silicon processor business.
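
For readers who have not met the term, the sketch below is a toy, cycle-by-cycle simulation of an output-stationary systolic array multiplying two small matrices. It is intended only to illustrate the general idea of operands pulsing through a grid of multiply-accumulate cells, one hop per clock; it does not describe the actual microarchitecture of the TPU, Wave Computing's designs, or the Transputer.

    import numpy as np

    def systolic_matmul(A, B):
        """Toy cycle-level simulation of an output-stationary systolic array.

        PE(i, j) keeps a running accumulator for C[i, j]. Row i of A streams in
        from the left edge (skewed by i cycles) and column j of B streams in
        from the top edge (skewed by j cycles), so A[i, k] and B[k, j] meet at
        PE(i, j) on cycle i + j + k.
        """
        n = A.shape[0]
        assert A.shape == B.shape == (n, n), "square inputs keep the toy simple"

        acc = np.zeros((n, n))    # one accumulator per processing element
        a_reg = np.zeros((n, n))  # operand each PE currently holds from the left
        b_reg = np.zeros((n, n))  # operand each PE currently holds from the top

        for t in range(3 * n - 2):
            # Operands advance one PE to the right / one PE down each cycle.
            a_reg = np.hstack([np.zeros((n, 1)), a_reg[:, :-1]])
            b_reg = np.vstack([np.zeros((1, n)), b_reg[:-1, :]])
            # Inject the next (skewed) operands at the array edges.
            for i in range(n):
                k = t - i
                if 0 <= k < n:
                    a_reg[i, 0] = A[i, k]
                    b_reg[0, i] = B[k, i]
            # Every PE multiplies what it sees this cycle and accumulates.
            acc += a_reg * b_reg
        return acc

    A = np.arange(9, dtype=float).reshape(3, 3)
    B = np.arange(9, 18, dtype=float).reshape(3, 3)
    print(np.allclose(systolic_matmul(A, B), A @ B))  # True

The attraction for deep learning is that this same multiply-accumulate grid, scaled up and pipelined, maps directly onto the dense matrix math that dominates neural-network workloads.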


There are even companies such as Lightmatter looking to use light itself, a concept known as photonic processing, to implement AI chips. The possibilities for fantastic improvements in performance and energy consumption are mind-boggling — if we can get light-based processing to work.

This massive investment in deep learning chips is chasing what looks to be a vast new market. Deep learning will likely be a new, pervasive, “horizontal” technology, one that is used in almost every business and in almost every technology product. There are deep learning processors in some of our smartphones today, and soon they will be in even lower-power wearables like medical devices and headphones.

Deep learning chips will coexist with industry-standard servers in almost every data center, accelerating new AI algorithms every day. Deep learning will be at the core of the new superchips that will enable truly autonomous driving vehicles in the not-too-distant future. And, on top of all of this silicon, countless software offerings will compete to establish themselves as the new Microsoft, Google or Baidu of the deep learning future.

If everyone who reads and likes our articles helped fund them, our future would be much more secure. For as little as $5, you can support us – and it only takes a minute. Thank you.

By: Ty Garibay
