
Data and multicloud, interview with Andrea Zinno of Denodo


Andrea Zinno, Sales Director & Data Evangelist at Denodo, tells us about the company’s approach to customers when it comes to hybrid data, multicloud and migration.

– When it comes to migration, many organizations struggle. What strategies and what support does Denodo offer?

The central issue in any migration is that it usually involves a change of technological infrastructure: in the case of the cloud, migration is not simply a "geographical" movement of data, but also entails adopting the applications and solutions offered by the cloud provider itself.
A typical example is on-premise data warehouses: when you migrate to the cloud of a leading provider (such as Amazon, Google or Microsoft), each of which has its own data warehouse solution, you also face an unavoidable technological change.

Looking at the world of data, the problem companies generally face is how to manage the migration with the least possible impact on users, who must continue to work with that very data: in an ever faster business context, it is inconceivable to stop a service "just" to complete a migration. It therefore becomes essential to rely on solutions that make the migration independent of the so-called "data consumers".

Denodo offers exactly that: as a logical data management solution, it creates a decoupling layer between the data consumer and the actual data, ensuring that the underlying migration is masked and does not impact users. When the customer already has a data infrastructure in place, users tend not even to notice that a migration is in progress, since it is the job of the Denodo Platform to carry out the transition without cutting off access to the data users need. Denodo's role is therefore to provide an intermediate decoupling layer that can absorb the complexity of any type of migration (cloud or on-premise).
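To make the idea concrete, here is a minimal Python sketch of such a decoupling layer. The names and structure are invented for illustration and are not the Denodo Platform's actual API; the point is only that consumers address a stable logical name while the physical source behind it is swapped.

# Minimal sketch of a logical decoupling layer (hypothetical names;
# not the Denodo Platform's actual API).

def fetch_from_onprem_warehouse():
    # Stand-in for a query against the legacy on-premise system.
    return [("ACME", "Milan"), ("Globex", "Rome")]

def fetch_from_cloud_warehouse():
    # Stand-in for the same data served by the new cloud warehouse.
    return [("ACME", "Milan"), ("Globex", "Rome")]

class VirtualDataLayer:
    """Maps stable logical view names to interchangeable physical sources."""

    def __init__(self):
        self._sources = {}

    def register(self, view, fetch_fn):
        self._sources[view] = fetch_fn

    def query(self, view):
        # Consumers reference only the logical view name; the source
        # behind it can be repointed without their knowledge.
        return self._sources[view]()

layer = VirtualDataLayer()
layer.register("customers", fetch_from_onprem_warehouse)
print(layer.query("customers"))   # served from on-premise

# Mid-migration: repoint the same view to the cloud. Consumer code is unchanged.
layer.register("customers", fetch_from_cloud_warehouse)
print(layer.query("customers"))   # now served from the cloud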


Data and information in the multicloud era

– Hybrid data analysis and multicloud management represent one of the most significant challenges of recent years. What do you propose for efficient management?

The complexity managed by the Denodo Platform takes various forms: complexity in the technological infrastructure, complexity arising from ever faster and more diversified data, and the complexity typical of a hybrid multicloud world, with architectures that extend beyond company boundaries to multiple cloud providers.


In this context, the concept of logical data integration is fundamental: a logical layer does not force you to collect data in a single place beforehand just to make it available. On the contrary, it is based on the idea of logical-physical separation, whereby only the representation of the data needs to be centralized (this is what actually serves users), while the data itself remains at the source.
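As a concrete sketch of the idea, the snippet below combines two hypothetical sources at query time; nothing is copied in advance, and only the combined representation (the "virtual view") is centralized. All names and data are invented for illustration.

# Illustrative sketch of logical data integration (invented sources and data;
# platforms such as Denodo do this with optimized federated queries).

crm_source = {"C1": {"name": "ACME"}, "C2": {"name": "Globex"}}    # e.g. a cloud CRM
billing_source = [("C1", 1200.0), ("C2", 800.0), ("C1", 300.0)]    # e.g. on-prem billing

def revenue_by_customer():
    # A "virtual view": joins both sources when queried.
    # Only this representation is centralized; the data stays at the source.
    totals = {}
    for customer_id, amount in billing_source:
        totals[customer_id] = totals.get(customer_id, 0.0) + amount
    return {crm_source[cid]["name"]: total for cid, total in totals.items()}

print(revenue_by_customer())   # {'ACME': 1500.0, 'Globex': 800.0}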

Data is multicloud

In a hybrid world like today's, with a multiplicity of distributed data and countless technological variants, this decoupling layer can mask and manage the great challenge of complexity.

A multicloud strategy implies the possibility of changing cloud providers on an ongoing basis. It therefore requires a modern method of data integration based both on decoupling who uses the data from where it resides, and on a logical approach that minimizes data movement and that, in the event of a migration, avoids having to discard previously copied data and replace it just because it is represented differently.
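One small illustration of that last point: a logical layer can normalize representation differences between providers, so repointing a view to a new provider does not invalidate what consumers already rely on. The formats and field names below are invented for the example.

# Sketch: normalizing two providers' representations behind one logical shape
# (hypothetical formats; not tied to any specific provider's schema).
from datetime import datetime, timezone

def from_provider_a(row):
    # Provider A returns dates as "YYYY-MM-DD" strings.
    return {"order_id": row["id"], "date": row["order_date"]}

def from_provider_b(row):
    # Provider B returns epoch seconds; normalize to the same shape.
    iso = datetime.fromtimestamp(row["ts"], tz=timezone.utc).strftime("%Y-%m-%d")
    return {"order_id": row["orderId"], "date": iso}

# The consumer-facing view always has the shape {"order_id", "date"},
# whichever provider currently serves the data.
print(from_provider_a({"id": 1, "order_date": "2023-05-01"}))
print(from_provider_b({"orderId": 1, "ts": 1682899200}))   # also 2023-05-01 (UTC)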

– The cloud approach allows you to better manage peaks and sudden requests. How?

In this case we are not referring only to data, but to the very essence of the cloud: it is based on a pool of large resources that are never 100% utilized, which constitute the reserve of computational power needed to handle peaks.
Since it is unlikely that all customers worldwide will peak at the same time, the cloud provider can respond to an individual customer's needs far more easily than a fully on-premise infrastructure could (in that case, the company would need an infrastructure that risks sitting idle 90% of the time, just to guarantee it can handle any peak on its own).
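A toy simulation makes this statistical effect visible; the workload numbers are invented purely for illustration.

# Toy simulation of statistical multiplexing (invented numbers): many customers
# with independent bursty loads need far less pooled capacity than the sum of
# their individual peak provisions.
import random

random.seed(42)
customers, hours = 1000, 24 * 365
individual_peak = 10.0   # each customer provisions for its own worst case

pooled_peak = 0.0
for _ in range(hours):
    # Each hour, a customer is idle (load 1) or bursting (load 10) with 2% probability.
    total = sum(individual_peak if random.random() < 0.02 else 1.0
                for _ in range(customers))
    pooled_peak = max(pooled_peak, total)

print(f"Sum of individual provisions: {customers * individual_peak:,.0f} units")
print(f"Observed pooled peak:         {pooled_peak:,.0f} units")
# The pooled peak comes out at a small fraction of the 10,000 units the
# customers would provision separately, which is the provider's advantage.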


The idea of information technology as a commodity has therefore taken hold: despite its impact in terms of ongoing costs, it allows companies to meet sudden demands without the risk of system outages and without the burden of otherwise unmanageable infrastructure expenses.


Data and multicloud, the potential for business

– Cloud and on-premise: how do you measure the impact on sustainability?

Today, great attention is paid to the issue of sustainability and to the calculation of the carbon footprint, which assumes even greater importance in the case of an energy-intensive sector such as information technology.

Digital transformation processes bring with them the need for ever greater computing power (think, for example, of the resources needed for machine learning or artificial intelligence): it is important to recognize that living in a digitized world also has an environmental impact.

However, there are architectures that minimize this impact by pooling, as cloud providers do, the computational power available to customers. Here the concept of the peak returns: traditional management would require each company to equip itself with the resources needed to cope with possible surges in demand. This would mean deploying very high energy levels, far higher than those of a cloud provider, which can guarantee the same resilience with fewer overall resources.
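To put rough, invented numbers on this: if 1,000 companies each provision 10 kW to cover their individual worst-case peaks, the combined on-premise capacity is 10 MW. A provider serving those same 1,000 customers, whose peaks rarely coincide, could plausibly cover the aggregate demand with capacity on the order of 1 to 2 MW, run at a far higher average utilization.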

We can therefore state that, while greater digitalisation means greater consumption, a cloud-based model makes it possible to optimize energy resources.

It should also be considered that the data centers of large cloud providers are typically more advanced in terms of consumption and energy efficiency than those of individual companies, which cannot afford to renew their hardware as frequently.


– Cybersecurity plays an increasingly important role in how companies are managed and in their development plans. What is your point of view?

Information security is divided into two levels: on the one hand, the so-called “perimeter” security (which protects the company from unauthorized access), on the other, data security.

It is on this second aspect, closely tied to the concept of data governance, that Denodo intervenes and plays a role of primary importance. Within each company, different policies must be implemented according to organizational levels and roles, ensuring that those who access data actually hold all the permissions required to do so. This is where the delicate balance lies between making data available to everyone who needs it (in a logic of data democracy) and, at the same time, ensuring that each user can consume only the data they are authorized to access.
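A minimal sketch of the kind of role-based policy this describes (hypothetical roles and data; real platforms express such policies declaratively rather than in application code):

# Role-based row filtering and column masking, illustrated with invented data.
ROWS = [
    {"customer": "ACME", "region": "IT", "revenue": 1500.0, "tax_id": "IT123"},
    {"customer": "Globex", "region": "DE", "revenue": 800.0, "tax_id": "DE456"},
]

POLICIES = {
    # Analysts see all rows, but sensitive columns are masked.
    "analyst": {"row_filter": lambda r: True, "masked": {"tax_id"}},
    # Country managers see unmasked data, but only for their own region.
    "it_manager": {"row_filter": lambda r: r["region"] == "IT", "masked": set()},
}

def query(role):
    policy = POLICIES[role]
    return [{k: ("***" if k in policy["masked"] else v) for k, v in row.items()}
            for row in ROWS if policy["row_filter"](row)]

print(query("analyst"))      # both rows, tax_id masked
print(query("it_manager"))   # only the IT row, nothing masked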

In this dynamic, the aforementioned layer of logical decoupling is an undoubted advantage: by eliminating the need to copy data just to make it integrable, access points are reduced and security levels indirectly increase (every new copy of the data creates a potential breach point for cybercriminals and raises risk levels, because ensuring that each collection point implements the same protections as the source systems can be complex).

Especially in the public administration world, giving those responsible for the data the ability to define rules is critically important. This is an aspect of security that is not usually considered, but it is essential to ensure that the use of data is consistent with the specifics of each organization.
