Tuesday, December 31, 2013

A (non-technological) toast to 2014


In previous years I have always posted a more or less funny message as a toast for the incoming year, but this year I'm going to "steal" a minute from you, because this toast is a bit more serious (though not technological); it comes straight from my gut. I want to raise a virtual glass with you so that in this 2014:

- May all the people who honour their work and their word manage to corner (and, if possible, jail) that bunch of crooks, swindlers and liars who unfortunately abound in politics, in the unions and in business (not all of them, but far too many);

- May we also rid ourselves of the parasites who thrive around the former, living at our expense without lifting a finger, inside the very system they have created;

- May every Spaniard who wants to work have the chance to do so, and under decent conditions;

- May Spain not lose its identity or that sense of solidarity that characterizes us, but may we learn to apply it first to our own, and may those who receive it not abuse it and turn into "freeloaders";

- May we finally realize that this country, UNITED and without the scourge of all those mentioned above, has a great future (and not only in sports or culture, but also economically and socially);

- And I toast to you, my friends, whom I appreciate far more than you imagine and than I ever show.

Sunday, December 15, 2013

OpenStack keeps gaining momentum in the Cloud market, as evidenced by Oracle's support, in spite of Gartner's opinion

A few weeks ago, Gartner analyst Alessandro Perilli said the project has a long way to go before it’s truly an enterprise-grade platform. In fact, in a blog post he wrote that “despite marketing efforts by vendors and favorable press, enterprise adoption remains in the very earliest stages”. The main reasons for that, in his opinion, are:
  • Lack of clarity about what OpenStack does.
  • Lack of transparency about the business model.
  • Lack of differentiation.
  • Lack of pragmatism.

OpenStack backers rebuffed such claims, and I must admit that I’m biased, because I work for a European company (Tissat, based in Spain, with several datacentres, one of them -Walhalla- certified as Tier IV by the Uptime Institute) that offers IaaS services using OpenStack. But I also have to admit that OpenStack is a solution that is continuously evolving and growing, and therefore I agree with some of Mr. Perilli’s statements, but I disagree with his main conclusion:

Maybe he’s right that it’s unusual for big companies to contribute to its code while also supporting and using it to deliver services, but let me mention some of those that are doing exactly that: Rackspace and NASA (maybe not the biggest, but they were the first), IBM (IBM’s open cloud architecture), HP (HP Cloud Services), Cisco (the WebEx service); they don’t seem like small players, do they? (I beg your pardon for the irony.) Besides, relatively smaller companies are contributing to, supporting and selling services on OpenStack, such as the traditional Linux distro providers: Red Hat, Novell (SUSE) and Canonical (Ubuntu). Finally, other big players using OpenStack are PayPal/eBay, Yahoo, CERN (the European Organization for Nuclear Research), Comcast, MercadoLibre, Inc. (e-commerce services), the San Diego Supercomputer Center, and so on; they aren’t small players either …

Can you think of players in the IT provider market as big as the first ones mentioned? Sure: Microsoft, Google, Oracle, … Well, surprise: last week Oracle announced that it embraces OpenStack. Although Oracle acquired Nimbula in March (and maybe Nimbula’s shift from its own proprietary private-cloud approach to becoming an OpenStack-compatible software supplier was the first sign of the change), Oracle is going to integrate OpenStack with its technologies: “Oracle Sponsors OpenStack Foundation; Offers Customers Ability To Use OpenStack To Manage Oracle Cloud Products and Services”. Oracle’s announcement said that:
  • Oracle Linux will include integrated OpenStack deployment capabilities.
  • Solaris will also get OpenStack deployment integrations.
  • Oracle Compute Cloud and Oracle Storage Cloud services will be integrated with OpenStack.
  • Likewise, Oracle ZS3 Series network-attached storage, Axiom Storage Systems, and StorageTek Tape Systems will all get integrated.
  • Oracle Exalogic Elastic Cloud hardware for running applications will get its own OpenStack integration as well.
  • And so on.
In other words, Oracle speaks about significant new support for OpenStack in an extremely ambitious manner, pretty much saying that it will support OpenStack as a management framework across an expansive list of Oracle products. Evidently, Oracle’s move is great support for OpenStack (and for my thesis, too, and probably another point against Mr. Perilli’s opinion) …

However, to be honest, let me doubt (for the moment) the ultimate motivations and objectives of Oracle: I’ve got the impression that Oracle is simply bowing to market pressure, adjusting to the sign of the times, but is not committed to what OpenStack means: a collaborative and inclusive community. On one hand, as I stated in my post “Cloud Movements (2nd part): Oracle’s fight against itself (and the OpenStack role)”, Oracle is fighting against itself because its traditional and profitable business model is challenged by the Cloud model, and it has been delaying its adoption as much as possible. (IBM did the same when its mainframes ran mission-critical applications on legacy databases, and a new -by then- generation of infrastructure vendors -DEC, HP, Sun, Microsoft and Oracle- challenged and disrupted the old IBM model: IBM was conflicted about selling the lower-priced, lower-margin servers needed to run them.) Even Oracle CEO Larry Ellison used to disdain Cloud Computing; for example, he called it “nonsense” in 2009. On the other hand, the recent Oracle announcement doesn’t necessarily imply a change in this matter.

Besides, the Oracle move raises suspicion, even disbelief, not only in me but in other people. Let me quote some paragraphs from the post by Randy Bias (co-founder and CEO of cloud software supplier CloudScaling) titled “Oracle Supports OpenStack: Lip Service Or Real Commitment?”. Randy’s position could be summarized in his own words: “Oracle is the epitome of a traditional enterprise vendor and to have it announce this level of support for OpenStack is astonishing”. Randy also wonders: “Can Oracle engage positively with the open-source meritocracy that OpenStack represents? Admittedly, at first blush it’s hard to be positive, given Oracle’s walled-garden culture.” And to back his answer, Randy reviews some Oracle facts:
  • “Oracle essentially ended OpenSolaris as an open-source project, leaving third-party derivatives of OpenSolaris (such as those promulgated by Joyent and Nexenta) out in the cold, having to fork OpenSolaris into Illumos.”
  • “Similarly, the open-source community’s lack of trust can be seen ultimately in the forking of MySQL into MariaDB over concerns about Oracle’s support and direction of the MySQL project. Google moved to MariaDB, and all of the major Linux distributions are switching to it as well.”


However, Randy finally concludes: “It’s hard not to have a certain amount of pessimism about Oracle’s announcement. However, I’m hopeful that this signals an understanding of the market realities and that its intentions are in the right place. We will know fairly soon how serious it is based on code contributions to OpenStack, which can be tracked at Stackalytics. (So far, there are zero commits from Oracle and only two from Nimbula, Oracle’s recent cloud software acquisition.) Personally, I’m happy to see Oracle join the party. It further validates the level of interest in OpenStack from the enterprise and reinforces that we’re all building a platform for the future”.
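Randy's suggestion of tracking code contributions per company can be scripted. Below is a minimal Python sketch that filters a Stackalytics-style company-commits response; the JSON field names (`stats`, `name`, `metric`) are assumptions modelled on the public Stackalytics API, and the sample figures are made up, so verify both against the live service before relying on this:

```python
import json

# Hypothetical sample of the JSON a Stackalytics company-stats
# endpoint might return (field names and numbers are assumptions,
# not real data -- check them against the live API).
SAMPLE_RESPONSE = json.dumps({
    "stats": [
        {"id": "red hat", "name": "Red Hat", "metric": 4519},
        {"id": "oracle", "name": "Oracle", "metric": 0},
        {"id": "nimbula", "name": "Nimbula", "metric": 2},
    ]
})

def commits_for(company, raw_json):
    """Return the commit count reported for `company`, or 0 if absent."""
    stats = json.loads(raw_json)["stats"]
    for entry in stats:
        if entry["name"].lower() == company.lower():
            return entry["metric"]
    return 0

if __name__ == "__main__":
    for company in ("Oracle", "Nimbula", "Red Hat"):
        print(company, commits_for(company, SAMPLE_RESPONSE))
```

Swapping `SAMPLE_RESPONSE` for the body of a real HTTP request (e.g. via `urllib.request.urlopen`) would give live numbers.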

And Randy’s last words bring me back to my initial point: I really think OpenStack is already a mature enough platform to do business with (in all the ways of other IT products or solutions), as the giants and other big IT companies are showing (IBM, HP, Cisco, Oracle, Rackspace, Yahoo, PayPal, Comcast, Red Hat, Novell, Canonical, etc.).

Finally, let me end this post with some partial pictures extracted from an infographic produced by OpenStack (you can get the whole infographic here):

Current OpenStack deployments span 56 countries:
Current OpenStack Deployments

Covering any-size organizations and a wide range of industry sectors:
Current OpenStack Organization Sizes        Current OpenStack Industry Sectors

Besides, every type of deployment is currently in use:
Current OpenStack Type of Deployments
And currently the 10 application types most often deployed on OpenStack are:
Current OpenStack Type of Workloads

Thursday, November 28, 2013

Tissat and Datapoint Europe renew their agreement to market the "Hosted Contact Center"

The renewal of this agreement, which allows both companies to exchange the services they provide to their respective customers, coincides with the launch of Tissat's Cloud Services Center.

Datapoint Europe, a multinational specialized in business solutions for the Contact Center, and Tissat, a company specialized in the outsourcing of mission-critical services, have renewed the agreement signed a year ago to market the Hosted Contact Center.
This agreement will allow them to strengthen the alliance they already maintained as partners, not only at the hosting level but also by exchanging the services each provides to its respective customers, so that they can offer a global proposition covering both infrastructure and Contact Center services. So far, Datapoint Europe's platform has been virtualized, integrating valuable services for all the users and companies that need a contact center.
“Tissat's new services are helping us maximize the value of our portfolio of On Demand solutions, giving us the flexibility and competitiveness the market demands of us at this time”, says José Luis Aroza, Sales Director of Datapoint Europe in Spain.
The renewal of this agreement coincides with the launch of Tissat's Cloud Services Center. The Cloud Assistance Center solution marks the launch of a new Cloud services model, running on Néfeles, the OpenStack-based Cloud Computing platform created by Tissat. In the words of Carmen García, Director of Tissat Madrid, “it is a true IaaS platform, supporting on-demand self-service and rapid elasticity, and offering measurable, Internet-accessible, multi-tenant services”.
And all of this is possible thanks to Tissat's flagship infrastructure, Walhalla, one of the most advanced data centres in Europe in terms of Security and Energy Efficiency, which allows customers to be offered the security and confidence of a TIER IV centre certified by the Uptime Institute.
Datapoint Europe is a consulting and systems integration company specialized in optimizing customer-brand interactions. With almost 50 years of experience in the market, it is headquartered in Madrid with subsidiaries in Germany, Benelux, Italy and France. More than 200 customers in Europe, with an active presence in banking, telecommunications, insurance, transport and industry, place Datapoint Europe among the leading European companies in its area of specialization, the Contact Center. (www.datapointeurope.com)

Wednesday, November 20, 2013

Analysis of several Cloud Adoption Studies: fostered by Customer Demand (and other drivers)

A 451 Research study reports that the market for cloud services is growing rapidly (as shown in an Interxion study), predicting a CAGR (Compound Annual Growth Rate) of 24% from 2011 to 2015. Besides, that report compares the cloud market with the traditional hosting market. (Note: the hosting market consists of dedicated hosting and managed hosting; to be able to compare the cloud and hosting markets, the study chose to leave SaaS out of the cloud market and reduce cloud computing to IaaS and PaaS.) When comparing cloud computing to the traditional hosting market, as shown in the next figure, we see that although the cloud computing share is growing rapidly (CAGR of 42%), with a total value of $4.8bn in 2012, it is still relatively small (18%) compared to the hosting market:
 
Hosting vs Cloud-by-Interxion
 
The above table shows that the provision of infrastructure services is still dominated by hosting providers offering traditional hosting services, but the cloud's growth figures suggest that cloud-based technologies will start to overtake the traditional market.
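For reference, the CAGR figures these studies quote follow the standard compound-growth formula, CAGR = (end/start)^(1/years) - 1. A minimal sketch (the start/end values below are illustrative, not taken from the 451 Research report):

```python
def cagr(start_value, end_value, years):
    """Compound Annual Growth Rate: the constant yearly growth rate
    that takes start_value to end_value over `years` years."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# A market growing at the reported 42% CAGR roughly quadruples in 4 years:
start = 1.0
end = start * (1.42 ** 4)
print(round(cagr(start, end, 4), 2))  # → 0.42
```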
 
Besides, IT spending is shifting from traditional IT services to Cloud Services:
 
Traditional IT Services spends vs Cloud Services spends-by-Aerohive Networks
 
 
Besides, according to a study by PC Connection, 69% of organizations are considering implementing Cloud or already have some application in the cloud:
 
Cloud Implementation-by-PC Connection
 
And 50% of organizations have assessed their environments to determine whether they are suitable for Cloud:
 
Assessment of your IT environment for Cloud-by-PC Connection
 
 
Therefore, let's go deeper into the Cloud Market and review its current drivers and barriers. Given that over the last year I've treated, directly or indirectly, the different sorts of Cloud barriers in several posts (“Cloud Computing Countries Ranking, or the Cloud Confusion even among market analyses: BSA vs Gartner vs IDC”, “Interoperability: a key feature to ask your Cloud Service Provider for”, “An infographic about Security and other Cloud Barriers”, “Cloud Computing and the EU Digital Agenda: A step in the right way, but too short”, and so forth), let me focus on the drivers. We'll do that by extracting data (as done in the prologue of this post) from several cloud market reports which, in some cases, show different or even contradictory numeric results, but which agree on the main point: the relationship and importance of the main drivers.
 
According to the third annual “Data Center Industry Survey” (2013) from The Uptime Institute, there are many factors driving public cloud adoption, from speed of deployment and scalability to potential cost savings. But the breakout driver for cloud computing adoption in 2013 is end-user or customer demand. In 2012, only 13% of respondents listed customer demand as a top driver, versus 43% in 2013, making it the leading driver over all other factors driving public cloud deployments:
 
Top cloud drivers-The Uptime Institute
 
 
Similar results are obtained by the study conducted by Orange, although the figures (and consequently the ranking) are different:
 
Top reasons for Implementing Cloud-by-Orange
 
 
Regarding this subject, a study conducted by Forrester Research (on behalf of IBM) shows that the top 2 types of applications companies are interested in moving to the Cloud are clearly driven by users: both external customers and internal employees:
 
Top 2 Types of applications to host on Cloud-by-Forrester
  
 
In consequence, all of them agree on the importance of cost reduction, as well as the growing importance of focusing on customer demands, as key drivers.

 
Besides, curiously, there are gradual changes in the way enterprises procure technology. With or without the blessing of IT, departmental and line-of-business managers are increasingly going directly to providers for SaaS apps or IaaS offerings. In fact, Gartner forecasts that by 2017 the CFO (Chief Financial Officer) will spend as much on technology as the CIO (Chief Information Officer). A lot of that investment will go to customer-facing “systems of engagement”, mainly related to e-commerce, which needs cloud infrastructure to scale properly and meet the highly variable demands of public Web and mobile apps.
 
 
Finally, and directly related to that last sentence, according to an Aerohive Networks infographic the 3 main advantages of using Cloud Services are:
  1. Instant Scalability
  2. Fast Deployment
  3. Automated backup & updates
top advantages of Cloud Services-by-Aerohive Networks
 
 
And clearly the first two (and arguably the third as well) are aligned with the increasing user demand, a forecast based on at least the following points:
  • 90% of organizations will support corporate applications on personal devices by the end of 2014 (Gartner, “Plan Now for the Hyperconverged Enterprise Network”, May 2012),
  • 1.04 billion smartphones and tablets will be shipped in 2014, overtaking for the first time the number of feature phones (IDC, Worldwide Quarterly Mobile Phone Tracker),
  • and Morgan Stanley estimates that the mobile web will be bigger than the desktop internet by the end of 2015.

90% of organizations will support corporate applications on personal devices

Friday, November 15, 2013

Personal Data Privacy & Europe’s Cloud Regulation: RESIGNATION??? (a personal conclusion drawn from the VII National ISACA Congress)

This is the third, and probably the last, of this consecutive series of posts I’ve dedicated to Personal Privacy.
 
As I already said in previous posts, these days I’ve fortunately been caught in a “work jam” (I say “fortunately” because I live in a country, Spain, where the current unemployment rate is about 25%, difficult for anyone to understand, although fortunately that seems to be changing). This work jam is the reason this post has been delayed so long: after the meeting I’m going to talk about, I felt compelled to write immediately, but, you know, I had other obligations to meet.
 
The fact is that yesterday I was invited by an excellent professional and businessman, Javier Peris (a good friend of mine), to the VII National Congress of IT Government, Auditing and Security (Congreso Nacional de Auditoría, Seguridad y Gobierno de TI) organized by ISACA in my home town, Valencia (the third biggest city in Spain). In spite of the work jam I decided to attend, because of the friendship, the well-structured and interesting programme, and the quality of the speakers and the other professionals and stakeholders involved. As in previous Congresses, this year all of the speakers were great too, and that is why my worries during and after the meeting are stronger. (By the way, the Congress ends today, but I haven’t been able to attend; so maybe some of my worries will be resolved today.)
 
Getting to the point: one of the most appealing events (in my opinion, and in spite of other interesting subjects covered by other speakers such as Carmen Serrano, Florencio Cano, Javier Zubieta and Javier Cao) was a round table about “Cyber War”.
 
During the whole discussion I was amazed to discover that no one faced up to the unfortunate recent facts disclosed by Snowden. I thought, perhaps, people were afraid the discussion would turn to political issues instead of the technical aspects and business consequences. ISACA’s Congress (like this blog) is a technical meeting, but when treating that subject it is very easy for the discussion to evolve toward the important political issues related to it. Since I shared that fear, I decided to wait (and people who know me will guess how difficult that was for me) to see when someone would introduce the argument that Europe has been (and probably still is being) cyber-attacked by the United States of America.
 
Let me say it once again: I’m not going to discuss whether we can be allied in NATO with a country that spies on our European Prime Ministers as well as our business leaders (and takes advantage of it, as they themselves recognized in the cases of the Brazil and Japan espionage), nor am I going to discuss whether the USA’s behavior is evolving toward a “police state” and/or Orwell’s “Big Brother” society. HOWEVER, I really wonder (because that was one of the other subjects treated at the ISACA Congress) whether Europe can keep signing the Safe Harbour agreement with the US about complying with EU Directive 95/46/EC on the protection of personal data. I also wonder how we can “sell” security prevention, assessment, auditing and consulting services about “data privacy”, knowing that not only hackers but also governmental agencies under no legal control can break it, and that the latter infringe it with complete impunity.
 
Recently, in my last post, titled “Personal Data Privacy & Europe’s Cloud Regulation: the privacy approach (Spain and other European countries are the leaders)”, as its title announces, I showed how Spain and other European countries hold the first positions of the privacy protection ranking. Here in Spain we have the LOPD law, which fully matches EU Directive 95/46/EC on the protection of personal data; besides, the Spanish Public Administration must follow the “National Security Framework” (ENS or “Esquema Nacional de Seguridad”), and recently a law for securing “critical industries” has been passed. All of them are good (although many people think they could be better) because they focus on improving the IT security of matters that “affect” citizens (in one way or another).
 
Consequently, in summary (and without going deep into this subject), they are also good for today's business in at least two ways: citizens will trust in them, and they foster business around implementing the appropriate security measures, meeting regulatory compliance, and auditing all of it (some of the ISACA Congress speakers covered these points). So I wondered why no one raised earlier the problems and consequences of the US behaviour.
 
Therefore, in the end, I decided to put the question to the round table. And the conclusion drawn from the answers, and from the silences, was VERY WORRYING:
 
“RESIGNATION” !!!
 
And now I understand better why (although very slowly) the European Commission wants to regulate some related subjects more strictly, despite the fact that those measures (as I stated in the post titled “Personal Data Privacy & Europe’s Cloud Regulation: the dilemma”) may have a negative impact on both business and innovation.

Wednesday, November 6, 2013

Personal Data Privacy & (Europe’s) Cloud Regulation: the privacy approach (Spain and other European countries are the leaders)

The more the data disclosed by Snowden is analyzed, the bigger the worries about Personal Data Privacy become, as the news in my last post shows; so let me come back to the subject, but from another point of view.
 
In my last post, speaking about the dilemma between Personal Data Privacy and Europe’s Cloud Regulation, we simplified the problem and played a trick: we mixed every kind of personal data, from basic data (name, age, sex, …, phone numbers, addresses: postal, e-mail, social networks, etc.) to phone and internet conversations and communications, through hobbies, preferences, likes and so on.
 
Of course a lot of legal and ethical business can be done with those data (if you decide to make them public): from direct one-to-one marketing that offers only what you may really be interested in (e.g. adventure travel if you love it) and doesn’t disturb you with anything else (e.g. not offering you meat dishes if you are a vegetarian), to corporate image monitoring or legal technology watching, through social-network-based forecasting of results, and so on. But it must also be ensured that these data are not used to discriminate against you on the basis of your religion or your political or sexual preferences, to mention only one clear example. So I think all of us will agree that some protection is needed, especially when we are speaking about human and civil rights. Conversely, quoting another blog, maybe you agree with Britain’s Foreign Secretary, William Hague, who last June said: “If you are a law-abiding citizen of this country going about your business and your personal life you have nothing to fear about the British state or the intelligence services listening to your phone calls or anything like that”; but my perception is that a lot of citizens (like me) think that those unfortunate words lay the foundations for a dangerous police-state mentality.
 
The BSA (Business Software Alliance), an organization I have a lot of disagreements with (because of the methods it uses to achieve its goals), published a report at the beginning of the year (see my post titled “Cloud Computing Countries Ranking, or the Cloud Confusion even among market analyses: BSA vs Gartner vs IDC” of 25th March 2013) about the 24 countries best prepared for the Cloud. The countries were scored according to their laws and regulations for the provision of cloud services, looking at seven areas: 1.- data privacy, 2.- cyber security, 3.- cyber crime control, 4.- preservation of intellectual property, 5.- technology interoperability and legal harmonization, 6.- free trade, and 7.- IT infrastructure; in other words, whether they have a comprehensive suite of modern laws that support and facilitate the digital economy and cloud computing. The result was that the top ten countries are Japan, Australia, Germany, United States, France, Italy, UK, Korea, Spain, and Singapore. (Please note the strong presence of the big European countries: Germany in 3rd place, France in 5th, Italy in 6th, the United Kingdom in 7th and Spain in 9th, contradicting in some way the aforementioned Gartner report.) Moreover, curiously, BSA takes “data privacy” as its first criterion, so I wonder what its weight was in the final score, because it isn’t the strong point of the United States, is it? So I think we need to explore this “data privacy” criterion more deeply …
 
Focusing only on the latter subject, I mean the data privacy criterion, the top 5 countries ranked best for privacy are Spain, the Czech Republic, Iceland, Norway, and Slovenia, according to BackgroundChecks.org, which uses 6 criteria (the first 4 are positive and the last 2 are negative) to rank them:
  1. The government has privacy laws
  2. There are fines for violating privacy laws
  3. The government actively protects free speech
  4. The government does not restrict access to the Internet
  5. The government uses spyware
  6. The government filters or censors the Internet
The Privacy Scoreboard
 
This picture is extracted from an infographic that you can find here: http://techcitynews.com/2013/10/15/these-5-countries-were-ranked-best-for-privacy-infographic/.
 
This infographic also has an area dedicated to the countries that spy on their citizens, such as the US, China, Malaysia, Syria, Nigeria, Iran and Bahrain, explaining the reasons why they have the dubious honour of being placed in this corner of shame. (Note: you can find it at the above reference/link.)
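The six criteria above reduce to a simple additive score: each positive criterion met adds a point, each negative one subtracts a point. Here is a toy Python sketch of such a scoring scheme; the equal weights and the country profiles are made up purely for illustration, and are not BackgroundChecks.org's actual data or methodology:

```python
# The 4 positive and 2 negative criteria from the ranking above
# (identifier names are my own shorthand, not the source's).
POSITIVE = ["has_privacy_laws", "fines_for_violations",
            "protects_free_speech", "open_internet_access"]
NEGATIVE = ["uses_spyware", "censors_internet"]

def privacy_score(country):
    """Toy additive score: +1 per positive criterion met, -1 per negative."""
    return (sum(country.get(c, False) for c in POSITIVE)
            - sum(country.get(c, False) for c in NEGATIVE))

# Illustrative, made-up profiles:
profiles = {
    "Utopia":   dict.fromkeys(POSITIVE, True),
    "Dystopia": {**dict.fromkeys(POSITIVE, False),
                 **dict.fromkeys(NEGATIVE, True)},
}
for name, data in profiles.items():
    print(name, privacy_score(data))  # Utopia scores 4, Dystopia -2
```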

Wednesday, October 30, 2013

Personal Data Privacy & (Europe’s) Cloud Regulation: the dilemma

Currently Personal Data Privacy is on everyone’s lips after this summer’s revelation by Edward Snowden about a US Government data collection program called PRISM, and it has become even more topical now that the analysis of the released top-secret documents has shown that the extent of spying by the National Security Agency (NSA) on electronic communications reaches some European Prime Ministers; or, to mention another example, the newspaper “Le Monde” has revealed that the NSA gathered more than 70 million French phone calls in a single month, “targeting not only people suspected of being involved in terrorism but also high-profile individuals from the world of business or politics”.
 
So, I’d like to come back in this blog to Data Privacy and its relationship with the intrinsically free data movements of the Cloud, and the possible impact of a (not wished for, but perhaps needed) Cloud Regulation: for example, the European Parliament busied itself attaching amendments to its data privacy regulation before Snowden’s revelations, and is now weighing amendments to address cloud computing that would effectively amount to a Cloud Regulation. In short, it’s basically a dilemma between:
  • On one hand, it is about progress, technical advances and the global business that Cloud technologies can foster. In a post written about a year ago (“Europe behind the US on Cloud”), analyzing Gartner’s report about why Cloud penetration is more delayed in Europe than in the US, it was stated that, according to Gartner, a possible cause was these Personal Data Privacy Regulations, seen as a protectionist barrier that precludes Cloud business growth (basically because Europe’s diverse and ever-changing data privacy regulations inhibit the movement of personal data to the cloud, and EU policy-making processes and practices can hinder business). And it’s clear to me that, in exchange, non-EU companies (mainly American ones) are also suffering this policy, because they become less competitive by having to adapt their products and/or services to EU privacy laws. Therefore, in the end, business and technical advances are slowed …
  • On the other hand, it is about human and civil rights. As a European citizen, I have no doubt that some data privacy protection is needed, without which George Orwell’s “Big Brother” world will arrive and the police-state mentality will succeed. Someone could perhaps even persuade me that it might be fair for Governments to have access to our private communications via the internet, in some circumstances, under the right and well-known conditions, and under the control of a trustworthy independent judiciary. It’s difficult to debate. And, in the end, it leads to the important and even more difficult debate about how democracy can protect itself (from terrorism and other radical ideas) without ceasing to be a “democracy” (otherwise terrorism will have won the war, even if it loses the battles). But this is a technical post, so let me keep close to technical/economic subjects.
 
Of course, some people (on both sides of the Atlantic, but more on the west side) will think that these European laws are less about data security and more about limiting the power of American corporations and making it easier for European companies to grow. However, many EU officials and Members of Parliament have stated that “it’s not about protectionism but about ensuring customers will receive the proper level of guarantees in terms of data protection and access across Europe”, because, as Neelie Kroes (the European Commission Vice President in charge of telecommunications and information policy) said, “we need to realize that European citizens will not embrace the cloud if they are worried for their privacy or for the security of their data”. And I share these ideas.
 
Furthermore, regarding this economic impact, it should also be noted that, conversely, the NSA is accused of conducting industrial espionage in countries all around the world, even allied countries, and its stated reason for doing so is: “we collect this information for many important reasons: for one, it could provide the United States and our allies early warning of international financial crises which could negatively impact the global economy. It also could provide insight into other countries’ economic policy or behavior which could affect global markets” (I’ll come back to this subject in a further post).
 
Besides, another point to be taken into account is that personal data are being monetized in different ways by a lot of companies. Two very different factions of people exist: one that values privacy and one that couldn’t care less (and, of course, a lot of variants in the middle). For some people, privacy is not valued (they don’t mind that their personal data are monetized and shared across platforms), but for others, privacy is sacred (they will even restrict their online presence and social networking). The problem, shown clearly by Snowden’s disclosures, is that neither faction actually knows much about what the US government (and other companies) can and cannot access, nor the real and full use that is going to be made of those data.
 
On the other side, there is a risk of going too far and effectively erecting a significant barrier to business, and in the current economic situation that could have a broad, negative impact on European and non-European companies and businesses. So finding the balance is key, and this dilemma between Personal Data Privacy and Business Regulation is not easy to solve; it is even harder when the business is built around a technology like the Cloud, where free movement of data is intrinsic and one of its advantages, so data can travel (or be copied for availability reasons) from one country to another, changing the jurisdiction over them and the laws to be applied.
 
And another conclusion is that data security itself must also be improved (through the use of strong encryption that can protect user data from all but the most intense decryption efforts).
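As a purely conceptual sketch of that idea (encrypting data on the client side, so the cloud provider only ever stores ciphertext), the following Python fragment uses a one-time pad, i.e. a random key as long as the message, used once. This is for illustration only; a real deployment would use a vetted authenticated cipher such as AES-GCM from a maintained cryptography library.

```python
# Conceptual sketch only: client-side encryption before data reaches the cloud.
# A one-time pad (random key as long as the message, used once) is used here
# purely for illustration; production systems should use an authenticated
# cipher (e.g. AES-GCM) from a maintained cryptography library.

import secrets

def encrypt(plaintext):
    key = secrets.token_bytes(len(plaintext))          # random key, same length
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext                             # the key never leaves the user

def decrypt(key, ciphertext):
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = encrypt(b"customer record")
assert decrypt(key, ct) == b"customer record"          # round-trip succeeds
# Without the key, the ciphertext alone carries no information about the data.
```

The point is not the cipher itself but the architecture: if only ciphertext crosses the border, the jurisdiction question becomes much less threatening for the data owner.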
 
Finally, another worrying reflection to make is that the NSA has shown that it is subject to the same risks of data loss (whatever the cause) as any other company, and Snowden is certainly not the only one who had access to that private data of other people …

Thursday, October 17, 2013

“Hybrid Cloud is to today’s Enterprise IT what Terminal Emulation was to Mainframe”

“Hybrid Cloud is to today’s Enterprise IT what Terminal Emulation was to Mainframe” (from @reillyusa)

Measuring Carbon footprint of #CloudComputing workloads

Measuring Carbon footprint of #CloudComputing workloads and #Green IT #ODCA @opendatacenter http://goo.gl/2XlLX4

OpenStack HAVANA release has been delivered today

Please let me copy this e-mail sent by Thierry Carrez to the whole OpenStack community:
 
Hello everyone,
 
It is my great pleasure to announce the final release of OpenStack 2013.2. It marks the end of the “Havana” 6-month-long development cycle, which saw the addition of two integrated components (Ceilometer and Heat), the completion of more than 400 feature blueprints and the fixing of more than 3000 reported bugs!
 
You can find source tarballs for each integrated project, together with lists of features and bugfixes, at:
OpenStack Compute:        https://launchpad.net/nova/havana/2013.2
OpenStack Image Service:  https://launchpad.net/glance/havana/2013.2
OpenStack Networking:     https://launchpad.net/neutron/havana/2013.2
OpenStack Block Storage:  https://launchpad.net/cinder/havana/2013.2
OpenStack Identity:       https://launchpad.net/keystone/havana/2013.2
OpenStack Dashboard:      https://launchpad.net/horizon/havana/2013.2
OpenStack Metering:       https://launchpad.net/ceilometer/havana/2013.2
OpenStack Orchestration:  https://launchpad.net/heat/havana/2013.2
 
The Havana Release Notes contain an overview of the key features, as well as upgrade notes and current lists of known issues. You can access them at:
 
In 19 days, our community will gather in Hong-Kong for the OpenStack Summit: 4 days of conference to discuss all things OpenStack and a Design Summit to plan the next 6-month development cycle, codenamed “Icehouse”. It’s not too late to join us there, see http://www.openstack.org/summit/openstack-summit-hong-kong-2013/ for more details.
 
Congratulations to everyone who contributed to this development cycle and participated in making this awesome release possible!

Monday, September 30, 2013

Europe is leading DCIM adoption (analyzing two recent DataCenter research surveys of Uptime Institute and IDC)

In a recent post (titled “Third-Party DataCentres are becoming increasingly attractive compared to Enterprise DataCentres, according to a recent Uptime Institute Survey”) I referred to the Uptime Institute’s “Data Center Industry Survey” (2013): http://uptimeinstitute.com/2013-survey-results. Among other points, that survey analyzes DCIM software adoption and shows that it is high this year, reaching 38% among respondents, with the major driver being capacity planning; however, in 2012, despite enormous interest, and although a fair percentage of respondents were already using tools that could be described as DCIM, there was no clear consensus on what a DCIM deployment needed to be.
 
DCIM-Level of adoption-by-Uptime Institute
 
 
Besides, the adoption rate is even higher in Europe, which leads global DCIM adoption at nearly 50%. Large organizations and colocation companies have also adopted DCIM in the 50% range.
 
As the Uptime Institute recognizes, both industry watchers and DCIM vendors have suggested that 38% adoption is too high. So where is the discrepancy? Two facts could clarify the subject, according to the Uptime Institute:
  • The Uptime Institute’s audience base tends to be larger, more advanced data center owners and operators – Uptime Institute Network members, Tier-certified data center owners, etc. – and therefore more likely to be ahead on leading-edge technology adoption.
  • Also, the term DCIM has been used to include anything from a home-grown system of spreadsheets and monitors to an advanced suite of fully integrated tools spanning multiple data centers.
Let me analyze the latter point. To address this problem, this year the Uptime Institute provided respondents with 451 Research’s definition of DCIM before the question, in order to provide clarity:
DCIM is defined as a data center-wide or organization-wide system or suite that collects and manages information about a data center’s assets, resource use and operational status.
By the way, I think Gartner’s definition is clearer and more precise:
DCIM is the integration of information technology and facility management disciplines to centralize monitoring, management and intelligent capacity planning of a data center’s critical systems. It enables a common, real-time monitoring and management platform for all interdependent systems across IT and facility infrastructures. (Note: Therefore, it enables the monitoring and collection of low-level infrastructure data to enable intelligent analysis by individuals with domain expertise (e.g., capacity planners and facilities planners), as well as holistic analysis of the overall infrastructure, and in consequence, a full DCIM solution provides detailed monitoring and measurement of data center performance, utilization and energy consumption, supporting more-efficient, cost-effective and greener environments).
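To make the “low-level infrastructure data” part of that definition concrete, here is a minimal sketch (entirely hypothetical readings and field names) of the kind of aggregation a DCIM tool performs: computing an average PUE (total facility power divided by IT power) from paired power samples, and raising a simple capacity alert of the sort capacity planners rely on.

```python
# Illustrative sketch (hypothetical data and field names): the kind of
# low-level metric aggregation a DCIM platform performs.

from statistics import mean

# Hypothetical power readings (kW) sampled over one hour
readings = [
    {"facility_kw": 460.0, "it_kw": 400.0},
    {"facility_kw": 470.0, "it_kw": 405.0},
    {"facility_kw": 455.0, "it_kw": 398.0},
]

def average_pue(samples):
    """Average PUE over a series of paired facility/IT power samples."""
    return mean(s["facility_kw"] / s["it_kw"] for s in samples)

def capacity_alert(samples, it_capacity_kw, threshold=0.8):
    """Flag when average IT load exceeds a fraction of provisioned capacity."""
    load = mean(s["it_kw"] for s in samples) / it_capacity_kw
    return load > threshold

print(f"Average PUE: {average_pue(readings):.2f}")               # 1.15 here
print("Capacity alert:", capacity_alert(readings, it_capacity_kw=450.0))
```

A real DCIM suite does this across thousands of sensor feeds, in real time, and correlates the facility side with the IT asset inventory; the arithmetic above is the core of it.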
So the Uptime Institute asked respondents specifically to exclude spreadsheets, BIM-type drafting programs and basic BMS systems. But I think this still does not rule out partial solutions, deployed without the intention, or the ability, to cover the whole scope of DCIM.
 
 
Now let me introduce a new survey, carried out by IDC and summarized in the infographic that you can find here.
 
DCIM-Unified management-by-IDC
 
 
According to the IDC survey, 92% of data center managers see DCIM as the solution to traditional DC problems:
  • Inconsistent Data Information.
  • Divided Data Center Operations.
  • Inconsistent Data Center Maturity.
DCIM-traditional DC problems-by-IDC
 
Those problems mean that 84% of data centers have had issues with power, space, cooling capacity, assets and uptime. These issues went on to negatively impact business processes:
  • 31% – Delay in application rollouts
  • 30% – Disrupted ability to provide service to customers
  • 27% – Forced to spend unplanned OPEX budget
  • 26% – Need to roll back an application deployment.
 
On the other hand, returning to the Uptime Institute survey, and specifically to one of the points stated in the first paragraph of this post, 70% of respondents listed “improving capacity planning” as a top driver for buying DCIM. All other drivers were distant runners-up. Capacity planning mistakes are expensive and can cripple a business, and the industry is hungry for any solution that makes this exercise less of a guessing game. In second place (at 30%) is identifying availability threats in advance:
 
DCIM-adoption drivers-by-Uptime Institute
 
And when asked about the features desired in a DCIM, the responses were:
 
DCIM-Level of adoption of features-by-Uptime Institute
 
These answers fit perfectly with the IDC survey, which concludes that about 70% of respondents want the following four functionalities:
  • Real-time monitoring, including power and temperature
  • Alerts and alarms for power and cooling
  • Inventory and asset management
  • Capacity analysis and planning
DCIM functionalities desired-by-IDC
 
Finally, changing the subject to barriers to DCIM adoption: over 60% of Uptime Institute respondents globally said cost was the primary barrier, and the second (at 37%, i.e. far behind the first) was integration of DCIM with existing systems:
 
DCIM-adoption barriers-by-Uptime Institute
 
And the fact is that while small companies are deploying inexpensive DCIM tools (48% report spending less than $100k), the largest companies are spending significantly more for scalability and features (24% report spending over $300k):
 
DCIM-cost for purchase-not subscription-by-Uptime Institute
 
Note: in the above tables, cost does not reflect the investment in human resources needed (from training to specialized technicians and so on).

Wednesday, September 18, 2013

Tissat is awarded Framework Agreement 27/2012 for the contracting of Information Systems Hosting Services

Tissat has been approved by the Dirección General de Patrimonio to act as a service provider in contracts with the public administration under the current 27/2012 services catalogue. This new award allows it to work with the public administration as a provider of services related to Information Systems Hosting (housing, dedicated hosting, virtual hosting, Cloud…).
 
This success is founded on the fact that Tissat currently manages and operates 5 DataCenters:
  1. “Walhalla”, located in Castellón, certified as Tier IV by The Uptime Institute.
  2. “Parc Tecnologic”, located in Paterna (Valencia), Tier III.
  3. “Brasil”, located in Madrid.
  4. 112-RM, located in Murcia.
  5. 112-CV, located in L’Eliana (Valencia).
The first 3 of these are owned by Tissat, and all 3 have been approved under Framework Agreement 27 of the Patrimonio Catalogue (“Information Systems Hosting Services”), awarded just a few days ago, in which Tissat also obtained first place in the evaluation. The agreement mainly covers the following services:
  • Information systems hosting services (housing, dedicated hosting, virtual hosting, cloud hosting, etc.).
  • Commissioning service associated with the hosting services.
  • Backup and recovery service associated with the hosting services.
  • Communications service associated with the hosting services.
  • Monitoring service associated with the hosting services.
  • Operations service associated with the hosting services.
  • Security service associated with the hosting services.
  • Statistics and reporting service.
 
Moreover, the design of these 5 DataCenters was also carried out by Tissat. The quality of Tissat’s designs is attested by the awards it has obtained; for example, the revolution that Walhalla represents was endorsed in December 2010 in London with the prestigious “Innovation in the Medium Data Centre” prize at the Data Centre Leaders Awards, a gala that gathers each year the most innovative projects and most groundbreaking ventures in the European data center arena. Furthermore, Tissat not only won an award but was also a finalist in “The Green Data Centre” category, a fact that demonstrates Tissat’s enormous commitment to the environment and its focus on energy savings, as shown by the PUE of 1.15 at which Walhalla operates.
 
In addition, Tissat has certified the management and operations of all its DataCenters according to the following standards:
  • ISO 20000
  • ISO 27001
  • ISO 50001
  • ISO 14001
  • ISO 9001
  • EU Code of Conduct for DataCentres.
  • And the recent (published at the beginning of the year) AENOR EA 0044:2013, which certifies the energy sustainability of a data center, and which we were the first to obtain.
 
In the words of the Director of Tissat Madrid, Carmen García: “with this award Tissat places itself at the forefront of the best ICT companies nationwide and demonstrates once again its capacity and its financial, technological and professional solvency, as the evaluation of our DataCenters and services passed all the evaluation criteria with top honors”.

Thursday, September 12, 2013

Third-Party DataCentres are becoming increasingly attractive compared to Enterprise DataCentres, according to a recent Uptime Institute Survey

The Uptime Institute, a division of The 451 Group, has recently published the results of its third annual “Data Center Industry Survey” (2013). The survey was developed to collect data on an annual basis around digital infrastructure deployment trends, procurement plans, operations and management practices, and other topics that impact the mission-critical data center industry.
Note: You can get a free copy of 2013 report at http://uptimeinstitute.com/2013-survey-results and the 2012 one at http://uptimeinstitute.com/2012-survey-results.
 
This year, the survey gathered responses (during spring 2013) from 1,000 data center facilities operators, IT managers and senior executives from around the world (mainly from North America, but with a growing number from Europe, Asia and Latin America), most of whom have responsibilities for more than one site. The survey covers different areas such as DC budgets, Cloud Computing, energy efficiency and green certifications, DCIM adoption, and so on.
 
It shows consolidating trends in most areas but also some surprising changes:
  • The global average PUE reported by respondents was 1.65, a great reduction from the 1.98 reported in the 2012 survey, or the 2.5 reported in 2011.
  • The percentage of companies pursuing green data center certifications like the US Green Building Council’s LEED program or Energy Star grew from 48% in 2012 to 58% in 2013.
  • Furthermore, the survey found that enterprise public cloud adoption rose from 10% in 2012 to 17% in 2013; however, private cloud adoption decreased slightly: only 44% are deploying private clouds this year, compared with 49% in 2012. According to the survey, “This seems to suggest that the companies who could make use of a private cloud platform have made the investment, and companies on the fence are either going to public cloud or walking away from the hype cycle”. It also said that large enterprises are almost twice as likely to implement public clouds as smaller enterprises.
  • Prefab modular DC designs are on the slide: 53% of respondents reported no interest, up from 42% last year; adoption is still around 8%, as in the 2012 survey (although the 2012 figure doubled that of 2011).
  • Reported DCIM software adoption is high, at 38% among respondents, with the major driver being capacity planning. (Note: in 2012, despite enormous interest, and although a fair percentage of respondents were already using tools that could be described as DCIM, there was no clear consensus on what a DCIM deployment needed to be.)


However, probably the most significant new result is that the largest DataCenter budget growth is occurring at “third-party” companies (77% of third-party data centre providers said they had received large, 10% or more, year-over-year budget increases this year, compared with just 47% of enterprise data centres), reflecting a shift in spending away from enterprise-owned data centres and toward outsourced options. According to Matt Stansberry, Uptime Institute director of content and publications, “This isn’t the end of the enterprise-owned data centre, but it should serve as a wakeup call. Going forward, enterprise data centre managers will need to be able to collect cost and performance data, and articulate their value to the business in order to compete with third-party offerings”.
 
Note: The Uptime Institute defines “third party” in this survey as “companies that provide computing capacity as a service in any form” (SaaS or other cloud computing services, multi-tenant colocation, or wholesale DC providers).

Tuesday, July 23, 2013

The European Standard for Innovation Management (UNE-CEN/TS 16555-1:2013) has been recently published

In June 2013, UNE-CEN/TS 16555-1:2013, the European Standard for Innovation Management, was approved, and on July 4th it was finally published.
 
As Oana-Maria Pop (Senior Editor of IM) stated at the beginning of this year: “As evidence suggests, innovation – both as a field of study and as a practical discipline – has gained considerable traction over the past 15 years. As organizations become broader and more complex, the need for a systematic approach to new product, service or business development techniques strengthens accordingly. And this is merely the tip of the iceberg – there are multiple insights on why communities can benefit from an organized approach to innovation – a process at the heart of our everyday (business) lives. As of today, this approach is becoming a reality and work slowly progresses towards the creation of a unified Innovation Management Standard. Why the delay? Plainly because relating innovation to standardization is something most practitioners still find counter-intuitive”.
 
But that work successfully concluded this past June. Some stakeholders and supranational bodies, such as AENOR in Spain and SIS in Sweden, were crucial to that success, integrated and collaborating within the CEN/TC 389 Innovation Management technical committee, which was created (by CEN) in November 2008 to support a culture of innovation in Europe and accelerate the access of innovation to both domestic and global markets. Note: similar to the worldwide ISO, at the European level, the Brussels-based CEN (the European Committee for Standardization) is the sole recognized organization for the planning, drafting and adoption of European Standards in all major areas of business except electro-technology and telecommunications.
 
Concretely, the Spanish standard UNE 166002 (“Gestión de la I+D+i”, i.e. R&D&i Management) was published by AENOR in 2006 and brought a consistent set of guidelines on which the new European Standard has been built. AENOR is now working on adapting (with light changes) its UNE 166002 Standard to fully meet UNE-CEN/TS 16555-1:2013. TISSAT has been certified under UNE 166002 since 2010.
 
This Technical Specification provides guidance on establishing and maintaining an innovation management system (IMS). It is applicable to all public and private organisations regardless of sector, type or size. This document provides guidance on:
- understanding the context of the organisation;
- establishing the leadership and commitment of top management;
- planning for innovation success;
- identifying and fostering innovation enablers/driving factors;
- developing the innovation management process;
- evaluating and improving the performance of the innovation management system;
- understanding and using innovation management techniques.

By using this document, organisations can increase their awareness of the value of an IMS, establish such a system, expand their capacity for innovation, and ultimately generate more value for the organisation and its interested parties. Since the innovation management system outlined in this document follows the PDCA (plan-do-check-act) structure, it can be integrated with other standardised business management systems existing in organisations, e.g. EN ISO 9001, EN ISO 14001, etc. That is the case at TISSAT, where these 3 Standards are fully integrated.
 
Finally, it should be noted that UNE-CEN/TS 16555-1:2013 is part of a family of Standards (on which TC 389 is currently working), as shown in the next table:
 
CEN/TS 16555 Standards Family

CEN/TS     Title                                                               Status      Estimated Date
16555-1    Innovation Management – Part 1: Innovation Management System        Published   2013-07
16555-2    Innovation Management – Part 2: Strategic Intelligence Management   Draft       2014-12
16555-3    Innovation Management – Part 3: Innovation Thinking                 Draft       2014-12
16555-4    Innovation Management – Part 4: Intellectual Property Management    Draft       2014-12
16555-5    Innovation Management – Part 5: Collaboration Management            Draft       2014-12
16555-6    Innovation Management – Part 6: Creativity Management               Draft       2014-12
16555-7    Innovation Management – Part 7: Innovation Management Assessment    Draft       2014-02

Thursday, July 11, 2013

TISSAT and GFI combine their technologies to deliver cloud virtualization solutions

Tissat, a Spanish company specialized in the outsourcing of mission-critical services and a Cloud Computing service provider, has signed an agreement with GFI Informática, an IT consulting and services company with an international presence, to offer a complete solution to its customers.
 
Under this agreement, Tissat will contribute its Cloud solutions, its flagship infrastructure, Walhalla (the first commercially available Spanish data center with Tier IV certification and the capacity to offer hosting internationally), and its platform for delivering cloud solutions.
 
For its part, GFI Informática contributes its experience in custom software development to provide this solution with the integration and maintenance required to deliver a personalized solution to each customer.
 
In the words of Carmen García, Director of Tissat Madrid: “it is time to join forces in order to offer customers the best solution and service in a sector with a wide range of technologies, often unknown to companies, to optimize their management and guarantee the highest possible security”.
 
“The agreement we have signed with Tissat will allow GFI to provide Cloud Computing solutions to our customers, mainly in environments that require special certification, under very advantageous and competitive conditions”, commented Javier Uriarte, Business Development Director at GFI.
 
About GFI Internacional
GFI Internacional is an IT consulting and services company with an international presence in more than seven countries and a strong footprint in France, where it has become one of the best-known brands in the IT services sector. It provides global solutions and a wide range of services backed by ISO 9001 certification. www.gfi.es

Wednesday, July 3, 2013

Tissat Recertifies its Energy Efficiency and Design in its Data Centers and Reobtains the ISO 50001 Standard

Tissat, a leader in the field of Data Centers and owner of Walhalla, the first Tier IV Data Center offering these services in Spain, becomes one of only two companies in Spain to have obtained this quality and energy-efficiency certification for its Data Centers.
 
This standard specifies requirements for establishing, implementing, maintaining and improving an energy management system, allowing Tissat to have a systematic approach and thus achieve continuous improvement in energy performance, including energy efficiency, energy security, energy use and consumption. This certificate helps Tissat continually reduce its use and its costs of energy and the emission of greenhouse gases.
 
The application of this standard for about two years, together with adherence to the best practices established by the “European Code of Conduct for Data Centres” and other measures that accompany and support the standard (e.g. a DCIM system), has allowed Tissat’s Walhalla Data Center to reach a PUE of 1.15, measured according to the strict, continuous Level 3 measurement defined by ASHRAE, The Green Grid, the Uptime Institute and others.
 
Note: this TISSAT certification has recently been complemented by AENOR’s EA 0044:2013 Data Centre Energy Sustainability Certification, which accredits that a data center complies with the established energy audits, as well as energy and carbon footprint management systems, contributing to the mitigation of climate change.
 
 
Requirements established and commitments undertaken under ISO 50001

Wednesday, June 26, 2013

IT evolution boosted by “Functional Diversity”

For everybody who understands Spanish, I recommend this 10-minute video reviewing IT evolution, which explains (with a subtle sense of humor, since the interviewer is Guillermo Fesser, a Spanish humorist and one of the members of the comic duo “Gomaespuma”, i.e. “foam rubber” in English) how initially expensive IT devices developed for people with disabilities have become cheap and commonly used gadgets. At the end, the video advocates “Functional Diversity” as a new, positive concept, instead of disability or handicap, and as a revitalizing incentive that boosts innovation.
 
Note: the term “divertad” is an invented word resulting from the contraction of two Spanish words, “diversidad” (diversity) and “libertad” (freedom), used by Javier Romañach (the interviewed human-rights activist, who is also an Internet pioneer in Spain) to say goodbye.
 
Hereafter, I quote Javier’s words:


A 10-minute video for technology students:
 
Salud y divertad,
Javier Romañach

viernes, 21 de junio de 2013

Private Cloud vs. Public Cloud (one of the discussion panels at Forecast 2013)

A few days ago Bhargavi Srivathsan published a post on the Open Data Center Alliance website, fostering an interesting discussion comparing Private Cloud and Public Cloud. This discussion was to be analyzed in detail in an ad-hoc panel at Forecast 2013, on June 18th. Unfortunately I could not attend that panel (I’m based in Spain), so maybe other ideas were added to the ones stated by Bhargavi in the post (in fact, Bhargavi’s idea was to prime and prepare the discussion for Forecast 2013).
 
I disagree with some of the points Bhargavi made in the post. Let me first copy the main paragraphs:
 
What is a Private Cloud? It is a complete cloud infrastructure that is dedicated to the needs of a single organization, typically implemented within the firewall of the organization employing internally available resources and controlled by the IT department. Now, how is this different from a public cloud? In a public cloud, the services are delivered over a network that is open to public access and data travels through an un-trusted network.
 
So is private cloud better than public cloud? It depends. There is definitely a choice to make. While private and public clouds share several architectural similarities, there are clear differences. A private cloud solution comes with the following advantages: 
  • Better control over data (and data security)
  • Ability to balance and provision resources with very little turnaround time
  • Better risk control 
Obviously, there are some disadvantages in choosing a private cloud. Private clouds generally cost more upfront – there are costs associated with maintenance, planning for redundancy to avoid data loss, and involving additional personnel to handle all operations in-house.
 
 
So let me give my opinion:
 
First, according to the NIST Cloud definitions, a Private Cloud can be operated by the same organization that uses it (as Bhargavi stated) but also by an external provider (e.g., we offer such services to our customers).
 
Second, to be a real and authentic Cloud it should be multitenant (NIST definition again), which implies that in a Private Cloud, although dedicated to a single company/organization, the infrastructure is shared between the different departments of that company: each of them becomes a different tenant; evidently, this usually implies big companies (see Note 1). So, in general, the “ability to balance and provision resources” is not a difference from the Public Cloud: this issue is roughly the same in both Public and “real” Private Clouds. However, I agree with the other two points: a Private Cloud offers both better control over data (and data security) and better risk control than a Public Cloud.
 
Finally, I also agree with Bhargavi that one disadvantage of the Private Cloud is its price: it will be more expensive than the Public Cloud whether you own it or use the services of a provider. Part of the reasons are the ones Bhargavi cited, but for me there is a more important one (because, in some way, it is the root cause of the others): even with real multitenancy (departments of a large company), the utilization of a Private Cloud’s infrastructure will, in general, be lower than that of a Public Cloud, so Public Cloud prices can be cheaper because of better utilization of the infrastructure, better utilization of the technicians operating the cloud, and so on.
  
Note (1): For companies that are not large enough, the “Community Cloud” could be the answer if a “private” approach is needed; in this model, according to the NIST definition, the cloud infrastructure is provisioned for exclusive use by a specific community of consumers from organizations that have shared concerns (e.g., mission, security requirements, policy, and compliance considerations); besides, as with private cloud, it may be owned, managed, and operated by one or more of the organizations in the community, a third party, or some combination of them, and it may exist on or off premises.
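The utilization argument above can be put into numbers with a toy model (all figures hypothetical): the effective cost per used unit of capacity is the infrastructure bill divided by the capacity actually used, so identical hardware costs yield a cheaper unit price at higher utilization.

```python
# Toy model (assumed numbers) of the utilization argument: effective cost
# per used unit = infrastructure cost / capacity actually used.

def cost_per_used_unit(monthly_infra_cost, capacity_units, utilization):
    """Cost of one utilized capacity unit, given average utilization (0..1)."""
    return monthly_infra_cost / (capacity_units * utilization)

# Same hardware bill, different (hypothetical) utilization profiles:
private = cost_per_used_unit(100_000.0, 1_000, utilization=0.35)
public = cost_per_used_unit(100_000.0, 1_000, utilization=0.65)

print(f"Private cloud: {private:.2f} per used unit")   # 285.71
print(f"Public cloud:  {public:.2f} per used unit")    # 153.85
```

The same reasoning applies to the people operating the cloud: the cost of a 24x7 operations team is spread over far more utilized capacity in a Public Cloud.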

Thursday, June 13, 2013

Reasons for Data Centre Energy Sustainability Certification

As announced in my last post, today I return to the subject of Data Centre Energy Sustainability Certification and the AENOR EA 0044:2013 Technical Specification, publishing an interview with my coworker Nuria Lago, who led our certification process, about TISSAT’s reasons for, and the results of, this certification.
 
Note: the interview was conducted in Spanish; an English translation follows.
 
 
What motivated Tissat to adopt this technical specification for sustainability in data centers?
First of all, differentiation from other players in the sector. In this way, TISSAT offers its customers better guarantees and meets the highest levels of demand in aspects they value, particularly in the field of sustainability. And, secondly, strengthening the company’s reputation and image by associating our brand with the latest and most demanding standards, in this case for efficiency and sustainability.
 
 
What was the implementation process like? What challenges and strengths did you find?
For an Information Technology company, moving toward the industrialization of processes was a great challenge, since it is not our core business. This certification is thus a great opportunity to broaden our business perspective and refocus service operations through energy efficiency and, consequently, cost reduction. It has also given us an improved, end-to-end perception of service delivery, extending our vision to the energy side.
 
 
What reductions in energy consumption have you recorded?
We are very focused on return on investment, since sustainability and efficiency help optimize the energy costs of data centers, thus maximizing the profit obtained from operating them. This is a key piece, because our energy consumption is extremely high due to the complexity of the infrastructure we operate (uninterruptible power supply, lighting, cooling, IT equipment, motors, etc.), so any adjustment, however small it may seem, means large savings in the long term. The Energy Management and Carbon Footprint systems give us a clear view of what we measure and where we are, but above all of where we want to go. They mean incorporating a system that lets you learn, improve every day and optimize consumption practically month by month, knowing, moreover, that you are helping to reduce CO2 emissions.
 
 
In your opinion, what are the most notable advantages of this certification?
Tissat is a mission-critical company with a large volume of services delivered through its data centers, so it is very important to be able to demonstrate the excellence of its operations. Moreover, Walhalla is a state-of-the-art data center that represents a model of data center industrialization with in-house design and development, prepared to host cloud computing services and with the capacity to offer hosting internationally.
Tissat es una empresa de misión crítica con un gran volumen de servicios que se prestan a través de sus Centros de Procesos de Datos, por lo que es muy importante poder demostrar la excelencia de su operación. Además, Walhalla es un Datacenter de última generación, que representa un modelo de industrialización de CPD con diseño y desarrollo propio, preparados para albergar servicios de cloud computing y con capacidad para ofrecer alojamiento en el ámbito internacional.