Cloud vs. On-Premise
Understanding cost differences between Identity as a Service (IDaaS) and on-premise deployments

Cost Comparison – Cloud vs On Premise
When comparing the cost of an on-premise deployment to our cloud Identity as a Service offering, your organization will find that a substantial amount of time and money can be saved. Contact us now to receive a free three-year cost comparison of cloud vs. on-premise.
Other Factors To Consider
Lost Revenue Due To Downtime
InformationWeek shed light on a 2011 study by CA Technologies that estimated what downtime costs businesses on a broad scale. Across 200 surveyed businesses in the USA and Europe, the study found that a total of $26.5 billion USD is lost each year due to IT downtime. That works out to an average of about $55,000 in lost revenue for smaller enterprises, $91,000 for midsize organizations, and over $1 million for large companies. This shows how important uptime is for production-level systems, and why downtime cost is a hidden factor that shouldn't be overlooked.
Providing A 24x7x365 Fully Geo-Redundant Service
Optimal IdM's cloud service is a fully managed 24x7x365 service with guaranteed Service Level Agreements (SLAs). Each day, millions of users depend on Optimal IdM's solutions for their authentication and security needs. This is our core business and competency, and we excel at providing the best service at the best possible price.
Cost Of Identity And Access Management As A Percentage Of Total IT budget
As one industry article explains, "Within the IT security community, identity- and access-management (IAM) initiatives are considered high value, but are notoriously problematic to deploy. Yet despite IAM's complexity, it represents 30 percent or more of the total information security budget of most large institutions, according to IDC (a sister company to CSO's publisher)." With the OptimalCloud, costs are significantly lower, and adoption time is quicker as well.
Estimated Energy Costs
According to one reporter's estimate, which uses the average kWh cost for energy from the US Energy Information Administration as of January 2013, an average in-house server in the USA (accounting for both direct IT power and cooling) consumes about $731.94 per year in electricity.
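To see how a per-server figure like this is typically derived, here is a minimal sketch of the underlying arithmetic. The wattage, cooling overhead, and kWh rate below are illustrative assumptions, not the article's actual inputs.

```python
# Sketch of a per-server annual electricity estimate.
# All input values here are hypothetical assumptions for illustration.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours of continuous operation

def annual_energy_cost(watts, rate_per_kwh, cooling_factor=2.0):
    """Estimate the yearly electricity cost for one always-on server.

    cooling_factor multiplies direct IT power to account for cooling
    overhead (doubling is a common rough rule of thumb; an assumption here).
    """
    kwh_per_year = watts / 1000 * HOURS_PER_YEAR * cooling_factor
    return kwh_per_year * rate_per_kwh

# Example: a 350 W server at $0.10/kWh, cooling included
cost = annual_energy_cost(350, 0.10)
print(f"${cost:.2f} per year")  # prints "$613.20 per year"
```

With different assumed wattage or local kWh rates, the same formula lands in the same general range as the figure quoted above, which is why downtime and energy costs are worth including in any cloud vs. on-premise comparison.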
OptimalCloud Deployment Scenarios
Watch this video to learn more about several deployment scenarios for The OptimalCloud.

