
What Retailers Should Consider Before Adding Data-Intensive Technologies

By: Retail CIO Outlook | Monday, October 12, 2020

According to a recent report, retailers are adding more data-intensive technologies while neglecting IT system performance. Is this healthy for their organizations?

FREMONT, CA: According to the 2019 Retail Technology Report, the major problem with information technology (IT) in the retail space is that retailers are investing heavily in data-intensive technology without addressing IT system performance. This can lead to severe issues, including system slowdowns and crashes.

In retail, either of those situations can cost both customers and revenue. The problem is compounded by increased investment in technologies that enable omnichannel operations, such as inventory management and order processing systems.

Undoubtedly, these investments will help satisfy the requirement for better customer relationships and sell-through. However, they will fail miserably if poor system performance drives away customers and prospects. In other words, managing big data is of little use unless it is high-speed data as well.

The IT system is the key to delivering a complete customer experience, and it must process and analyze significant volumes of data for a multitude of crucial functions:

• While average retail inventory accuracy is about 65 percent, omnichannel fulfillment requires at least 96 percent accuracy to offer customers a seamless experience.

• In addition to accurate inventory, omnichannel retailers are also advised to integrate data from the supply chain, credit and collections, CRM, marketing, and sensor networks.

• As retailers integrate advanced technologies such as the Internet of Things (IoT) and artificial intelligence (AI), IT systems must be capable of processing and analyzing even more data.

Processing and analyzing data depend on the input/output (I/O) performance of the overall system, also known as throughput, which can easily go unnoticed. Even as the IT industry advances with faster networks, higher memory speeds, more bandwidth, and faster processors, poor I/O performance can negate all of those gains.
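Throughput is straightforward to observe empirically. The sketch below is a minimal, illustrative benchmark (not from the report) that measures effective write throughput by timing how long it takes to flush a fixed amount of data to disk; the function name and parameters are hypothetical choices for this example.

```python
import os
import tempfile
import time

def measure_write_throughput(total_mb=16, block_kb=1024):
    """Write total_mb of data in block_kb-sized chunks to a temp file,
    force it to disk, and return the effective throughput in MB/s."""
    block = b"\0" * (block_kb * 1024)
    blocks = (total_mb * 1024) // block_kb
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(blocks):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())  # ensure data actually reaches storage
        elapsed = time.perf_counter() - start
        return total_mb / elapsed
    finally:
        os.remove(path)

print(f"Effective write throughput: {measure_write_throughput():.1f} MB/s")
```

Running a simple check like this before and after adding a data-intensive workload makes an otherwise invisible bottleneck visible.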

Most retailers are aware that incorporating new technology into existing IT systems poses a significant challenge. However, many are unaware that cost-efficient solutions exist to address application performance at the operating system, file system, and storage levels. These solutions can accelerate performance by 30 to 50 percent or more without hardware or network upgrades.

Retailers with conventional thinking about IT investments reason along familiar lines: needing more compute power means purchasing more systems; requiring faster network speeds means increasing bandwidth and buying the hardware that goes with it; running short of storage means buying more of that hardware too. In this way, cost increases in proportion to demand for the three fundamentals: applications, uptime, and speed.

However, there are solutions that can help contain IT costs. Data center infrastructure management (DCIM) software is a useful instrument for analyzing and minimizing the overall cost of IT. In fact, nearly $2 billion has been saved by the U.S. Data Center Optimization Initiative since 2016.

Other solutions are also available that enhance performance and extend the life of existing systems without requiring new hardware.

Many large enterprises performing data analytics need computer systems that access numerous, widespread databases, pulling data together through millions of I/O operations. The system's analytic capability depends on the efficiency of those operations, which in turn depends on the efficiency of the computer's operating environment.

I/O performance degradation advances over time, especially in Windows environments. This degradation can occur with any storage and can reduce the system's overall throughput by 50 percent or more, because Windows breaks data transfers to storage into smaller, less efficient operations. It can happen in any data center, whether on-premises or in the cloud, and it tends to be worse in virtualized computing environments. In a virtual environment, the many systems all sending I/O up and down the stack and to and from storage generate small, fractured, random I/O, creating a 'noisy' environment that slows application performance. Left untreated, the situation worsens over time.
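The cost of small, fractured I/O described above can be demonstrated with a simple experiment. This sketch (an illustration, not from the article) times writing the same amount of data two ways: in a few large chunks versus in many tiny chunks that are each forced to storage, which mimics the fragmented, "noisy" I/O pattern; the function name and parameters are hypothetical.

```python
import os
import tempfile
import time

def timed_write(total_kb, chunk_kb, sync_each=False):
    """Write total_kb of data in chunk_kb-sized pieces and return the
    elapsed seconds. With sync_each=True, every chunk is flushed and
    fsync'd, approximating many small, independent I/O operations."""
    chunk = b"x" * (chunk_kb * 1024)
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(total_kb // chunk_kb):
                f.write(chunk)
                if sync_each:
                    f.flush()
                    os.fsync(f.fileno())
        return time.perf_counter() - start
    finally:
        os.remove(path)

# Same total data volume, two I/O patterns; the fractured pattern
# is typically far slower on most storage.
large = timed_write(total_kb=1024, chunk_kb=256)
small = timed_write(total_kb=1024, chunk_kb=4, sync_each=True)
print(f"large chunks: {large:.3f}s, tiny synced chunks: {small:.3f}s")
```

The same data volume moved through many tiny operations typically takes far longer, which is exactly why fractured I/O in virtualized stacks quietly erodes throughput.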

Sometimes even experienced IT professionals assume that the latest hardware will resolve these issues. Since data is so vital to running an organization, retailers are tempted to spend on new hardware to solve the problem. While additional hardware can mask the degradation temporarily, targeted software can improve system throughput by 30 to 50 percent or more. Such software has the benefit of being non-disruptive, with no ripping and replacing of hardware. And because it runs in the background, it is transparent to end users. In this way, a software solution can handle more data by eliminating overhead, increasing performance at a much lower price and extending the life of existing systems.

While retailers are weighing technology investments for omnichannel operations, they should not neglect the requirement for optimal system performance.
