The CPU, or central processing unit, is the principal part of any digital computer system, generally described as comprising the main memory, the control unit, and the arithmetic-logic unit. It constitutes the physical heart of the entire computer system; to it are linked various peripheral devices, including input/output equipment and auxiliary storage units. In modern computers and mobile devices, the CPU is contained on an integrated circuit chip called a microprocessor.

The control unit of the central processing unit regulates and integrates the operations of the computer. It selects and retrieves instructions from main memory in the proper sequence and interprets them so as to activate the other functional elements of the system at the appropriate moment. All input data are transferred via main memory to the arithmetic-logic unit for processing, which involves the four basic arithmetic functions (addition, subtraction, multiplication, and division) and certain logic operations, such as comparing data and selecting a problem-solving procedure or a viable alternative based on predetermined decision criteria.
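The fetch-interpret-execute loop described above can be sketched in software. This is only an illustrative toy: the opcodes, register names, and program below are invented for the example, and real CPUs implement the cycle in hardware.

```python
# Toy illustration of the control unit's fetch-decode-execute cycle.
# Opcodes and the sample program are invented for this sketch.

def run(program, memory):
    """Execute a list of (opcode, operand) instructions against a memory dict."""
    acc = 0          # accumulator register in the arithmetic-logic unit
    pc = 0           # program counter maintained by the control unit
    while pc < len(program):
        op, arg = program[pc]        # fetch the next instruction in sequence
        pc += 1
        if op == "LOAD":             # decode, then activate the right unit
            acc = memory[arg]        # move data from main memory to the ALU
        elif op == "ADD":
            acc += memory[arg]       # one of the four basic arithmetic functions
        elif op == "STORE":
            memory[arg] = acc        # write the result back to main memory
        elif op == "JZ" and acc == 0:
            pc = arg                 # a logic operation: branch on a comparison
    return memory

mem = {0: 2, 1: 3, 2: 0}
run([("LOAD", 0), ("ADD", 1), ("STORE", 2)], mem)
print(mem[2])  # 5
```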

cloud computing Tech. Trends

How COVID-19 has pushed companies over the technology tipping point—and transformed business forever

In just a few months’ time, the COVID-19 crisis has brought about years of change in the way companies in all sectors and regions do business. According to a new McKinsey Global Survey of executives, their companies have accelerated the digitization of their customer and supply-chain interactions and of their internal operations by three to four years. And the share of digital or digitally enabled products in their portfolios has accelerated by a shocking seven years. 

Nearly all respondents say that their companies have stood up at least temporary solutions to meet many of the new demands on them, and much more quickly than they had thought possible before the crisis. What’s more, respondents expect most of these changes to be long lasting and are already making the kinds of investments that all but ensure they will stick.

cloud computing security

Rushed digital transformation is creating security risks

The pandemic provided the kick in the pants that many enterprises needed to finally get long-gestating digital transformation efforts underway. But for many organizations, those transformations turned into rush jobs, with projects pushed into production far earlier than planned.

While some of these transformations emerged unscathed, many weren’t so fortunate, carrying with them a raft of cybersecurity vulnerabilities. These vulnerabilities have in turn led directly to a surprising number of breaches.

computing Printers Tech. Trends


A semiconductor chip serves as the brain of anything that is computerized or uses radio waves. It handles complex operations such as arithmetic and data storage that are integral to cell phones, tablets, kitchen gadgets, laptops, video game consoles, and automobiles.

In vehicles, dozens of individual semiconductor chips are used for everything from controlling engine temperature to alerting drivers that an oil change is due. The chips produced by semiconductor companies can be categorized in two ways: by the integrated circuits they use or by the chip’s functionality.

Usually, chips are categorized in terms of their functionality. However, they are sometimes divided into types according to the integrated circuits (ICs) used.

cloud computing

Six predictions for the future of the edge

The edge is going to change the way we interact with each other and the world.

Thanks to edge computing, the world is about to look much different.

Within a decade, the edge will boast more computing power—and produce far more data—than the cloud does today, says Lin Nease, HPE Fellow and chief technologist for IoT.

The edge is where the Internet of Things, artificial intelligence, and ultrafast 5G networks are converging. And that will change our lives dramatically over the next few years.

Here are six major trends we can expect to see over the coming decade.


Patch Management

Patch management is the process of identifying, acquiring, installing, and verifying patches for products and systems. Patches correct security and functionality problems in software and firmware. From a security perspective, patches are most often of interest because they mitigate software flaw vulnerabilities; applying patches to eliminate these vulnerabilities significantly reduces the opportunities for exploitation. Patches are usually the most effective way to mitigate software flaw vulnerabilities, and are often the only fully effective solution.
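The "verifying" step above can be illustrated with a small sketch that checks installed package versions against the minimum versions a patch advisory mandates. The package names and version numbers here are invented for the example, and real patch management tools track far more state than this.

```python
# Sketch of patch verification: flag packages whose installed version
# is older than the minimum patched version. Names/versions are invented.

def parse(version):
    """Turn a dotted version string like '1.2.10' into a numeric tuple."""
    return tuple(int(part) for part in version.split("."))

def unpatched(installed, required):
    """Return the packages that still need the patch applied."""
    return [pkg for pkg, ver in installed.items()
            if pkg in required and parse(ver) < parse(required[pkg])]

installed = {"openssl": "1.1.1", "zlib": "1.2.13"}
required  = {"openssl": "1.1.2", "zlib": "1.2.11"}
print(unpatched(installed, required))  # ['openssl']
```

Comparing numeric tuples rather than raw strings avoids the classic mistake where `"1.2.10" < "1.2.9"` compares true lexicographically.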


Progressive Web Application

Platform-specific applications, built natively for Android, iOS, Windows, and other platforms, are known for being incredibly rich and reliable. They’re ever-present, on home screens, docks, and taskbars. They work regardless of network connection and launch in their own standalone experience. They can read and write files on the local file system, access hardware connected via USB, serial, or Bluetooth, and even interact with data stored on the device, like contacts and calendar events. In these applications, you can do things like take pictures, see the currently playing song on the home screen, or control song playback while in another app.

computing security

Recovery Point Objective (RPO) and Recovery Time Objective (RTO)

Disaster recovery is one of the most important concerns for system administrators as organisations increasingly rely on technology to run their websites and apps, email, business applications, and day-to-day operations. Two parameters that play an important role in disaster recovery are the Recovery Point Objective (RPO) and the Recovery Time Objective (RTO). Both parameters form the basis of an organisation's disaster recovery and business continuity plans, including how system administrators design the backup process, the backup frequency, the recovery time limits, and the recovery procedures. Though related, the two parameters measure different things: RPO bounds how much data may be lost, while RTO bounds how long recovery may take. There are other metrics too, such as the recovery point actual and recovery time actual, which can only be determined during a real-life incident or a DR drill.
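The difference between the two parameters becomes concrete when you compute the "actuals" after an incident. The timestamps below are invented for illustration; in practice they come from backup logs and incident reports.

```python
# Sketch of measuring the achieved RPO and RTO after an outage.
# All timestamps are hypothetical examples.
from datetime import datetime, timedelta

def achieved_rpo(last_backup, failure):
    """Data-loss window: time between the last good backup and the failure."""
    return failure - last_backup

def achieved_rto(failure, restored):
    """Downtime window: time between the failure and full restoration."""
    return restored - failure

last_backup = datetime(2021, 6, 1, 2, 0)   # nightly backup completed
failure     = datetime(2021, 6, 1, 9, 30)  # outage begins
restored    = datetime(2021, 6, 1, 13, 0)  # service fully restored

print(achieved_rpo(last_backup, failure))  # 7:30:00 of data at risk
print(achieved_rto(failure, restored))     # 3:30:00 of downtime

# Compare the actuals against the objectives set in the DR plan,
# e.g. a hypothetical 4-hour RTO target:
assert achieved_rto(failure, restored) <= timedelta(hours=4)
```

A shorter RPO target drives more frequent backups or replication; a shorter RTO target drives faster (and usually more expensive) recovery procedures.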


Azure Cloud

What is Cloud?

Cloud computing is an information technology (IT) paradigm that enables ubiquitous access to shared pools of configurable system resources and higher-level services that can be rapidly provisioned with minimal management effort, often over the Internet. Cloud computing relies on sharing of resources to achieve coherence and economies of scale, similar to a public utility.