May 19, 2020

Private Clouds Rain Down Big Benefits

Cloud Computing
Andrey Dmitriev
Bizclik Editor
5 min

By: Andrey Dmitriev

Cloud environments have taken the business landscape by storm, combining the most advanced technologies available into a compelling mix of cost, performance and functionality. As a result, they deliver big rewards: high availability, redundancy, reduced cost and the flexibility businesses need. Just a few years ago, network performance was frustratingly inconsistent, compute resources were relatively expensive, and requisite capabilities such as encryption and security were separate parts of the stack to be integrated and managed. That’s all changed.

Significant advancements in networking, compute capacity, storage, databases and security, all wrapped in virtualization, have made cloud possible. Forrester Research reported that 46% of respondents in a recent study indicated that they will be implementing a private cloud this year. It’s no longer a question of whether or not to deploy cloud, it’s a question of how.

Cloud solutions are tightly integrated packages of servers, networking and storage “glued together” by virtualization. Clouds can be deployed and managed in a few ways depending on where the hardware/software is hosted and who plans, deploys and manages the environments. 

Public cloud – Amazon’s AWS or GoGrid are examples of public clouds, where the provider offers “compute capacity for rent” and manages the infrastructure environment (though not the business applications) for customers. Despite the benefits of public clouds, some high-profile outages and security concerns leave some feeling uneasy about the public cloud. Thankfully, there’s another option: private cloud.

Private cloud – Private clouds can be deployed in the customer’s data center or even a trusted partner’s data center, and can be managed by the company’s IT team or a trusted partner’s operations team. Either way, a private cloud is an ensemble of technologies that provides the benefits of cloud, but because the entire environment is consumed by one organization, it reduces the security and resource-allocation concerns that come with sharing infrastructure with the masses. In practice, many companies will use their private cloud to support a few enterprise applications, taking advantage of the shared compute resources. It’s common to have the company’s e-mail and collaboration applications running alongside a web store or other application in a private cloud.

Private clouds produce many operational benefits, including:

  • On-demand self-service, which allows teams to use the cloud as a “development sandbox”
  • Resource pooling among applications
  • Elasticity: rapid expansion of capacity
  • Ability to measure service levels and resource consumption
  • Additional economic benefits when deployed in a third-party data center and managed by that third party

The cost/benefit analysis of public cloud compared to private cloud depends on many factors, including the length of time for which the cloud capacity is needed and how reliably the operations team can forecast the required capacity. If the duration is long and the capacity can be reasonably and reliably measured, private cloud offers a strong ROI. This is partially due to the marked improvements in performance and functionality that make five-year hardware replacement schedules perfectly acceptable, whereas best practice just a few years ago was to replace hardware after just three years. Further, outsourcing the deployment and management of the private cloud has additional benefits when one considers the cost of hiring IT professionals who understand all of the underlying technologies and applications, and how they fit together.
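That duration-and-forecast trade-off can be sketched with a back-of-the-envelope comparison. Every rate and figure below is hypothetical, chosen only to illustrate the shape of the calculation:

```python
# Back-of-the-envelope public vs. private cloud cost comparison.
# All rates and figures are hypothetical, for illustration only.

def public_cloud_cost(hourly_rate, hours):
    """Pay-as-you-go: cost scales linearly with hours consumed."""
    return hourly_rate * hours

def private_cloud_cost(capex, monthly_opex, months):
    """Up-front hardware spend plus ongoing operations cost."""
    return capex + monthly_opex * months

years = 5                  # a five-year hardware replacement schedule
hours = years * 365 * 24   # 43,800 hours of steady usage

public = public_cloud_cost(hourly_rate=5.00, hours=hours)   # e.g. a rented cluster
private = private_cloud_cost(capex=60_000, monthly_opex=1_000,
                             months=years * 12)

print(f"Public cloud over {years} years:  ${public:,.0f}")   # $219,000
print(f"Private cloud over {years} years: ${private:,.0f}")  # $120,000
```

With steady, well-forecast capacity over a long horizon, the fixed-cost private option comes out ahead; for short or unpredictable workloads, the linear pay-as-you-go curve usually wins instead.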

Increasingly dynamic cloud deployments require infrastructure integration that demands a high degree of expertise and a stack familiarity that many enterprises, especially smaller IT organizations, don’t have. Even large organizations that have successfully deployed “do-it-yourself” builds in the past have reported challenges in mastering the technology selection, capacity planning, system monitoring and tuning, integration and support of their private clouds.

The private cloud environment is further complicated by the potential interaction among the applications that share it. Careful planning and an understanding of quality-of-service features must precede any deployment to ensure that one application does not degrade the performance of another.

Approaches to architecting the private cloud environment range from off-the-shelf to highly customized. Some vendors have rolled out templates: preconfigured systems of components that are packaged, priced and delivered in a simplified manner to address common business use cases. They take much of the work out of planning and integrating the components, yet they may not match clients’ specific needs and often require customization to meet an organization’s particular requirements. Templates can be a good starting point for some companies migrating to the cloud, yet combining them can get complicated.

IT organizations must ensure that new implementations assimilate with existing infrastructure, and support current and future initiatives. Does the template support the specific applications and versions that we have or need? Is the template built for a specific use case? 

Private cloud initiatives deliver much-needed performance and scalability while providing savings (even over public clouds), much higher security and a lower risk of performance issues. The recommended approach is to understand capacity needs and trends and identify potential private cloud-based solutions before purchasing, to ensure the private cloud is right-sized while protecting the investment. Continuous monitoring of performance profiles ensures that today’s needs are met and the infrastructure can scale for tomorrow’s growth.
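A minimal sketch of what such a right-sizing check might look like, assuming monitoring already reports peak usage; the headroom and scale-down thresholds here are illustrative assumptions, not recommendations:

```python
# Minimal right-sizing check: compare provisioned capacity against the
# observed peak plus a headroom margin. Thresholds are illustrative
# assumptions, not prescriptive guidance.

def right_size(provisioned_cores, peak_cores, headroom=0.30):
    """Return a coarse sizing verdict from a monitored usage peak."""
    target = peak_cores * (1 + headroom)   # capacity we want available
    if provisioned_cores < target:
        return "scale up"                  # today's needs are at risk
    if provisioned_cores > target * 1.5:
        return "scale down"                # paying for idle capacity
    return "ok"

print(right_size(provisioned_cores=128, peak_cores=100))  # scale up
print(right_size(provisioned_cores=256, peak_cores=100))  # scale down
print(right_size(provisioned_cores=150, peak_cores=100))  # ok
```

Running a check like this continuously against monitored performance profiles is one way to keep the environment sized for today while leaving room for tomorrow’s growth.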

About Mentora:

Mentora is a provider of managed hosting, managed services and application performance testing. Since 2001, Mentora has provided high-touch hosting services for businesses, including some of the fastest-growing and most dynamic online retail, B2B and SaaS sites in the country. All Mentora hosting support is delivered by Level-III engineers, and the company is a well-recognized provider of application and infrastructure performance testing services. Mentora specializes in testing e-commerce, web, Oracle E-Business Suite, healthcare applications and Citrix and WebSphere MQ technologies, and offers dedicated Private Cloud and Hybrid Cloud solutions with provisioned and/or customer-owned equipment. For more information, visit


Jun 18, 2021

Intelliwave SiteSense boosts APTIM material tracking

3 min
Intelliwave Technologies outlines how it provides data and visibility benefits for APTIM

“We’ve been engaged with the APTIM team since early 2019 providing SiteSense, our mobile construction SaaS solution, for their maintenance and construction projects, allowing them to track materials and equipment, and manage inventory.

We have been working with the APTIM team to standardize material tracking processes and procedures, ultimately with the goal of reducing the amount of time spent looking for materials. Industry studies show that better management of materials can lead to a 16% increase in craft labour productivity.

Everyone knows construction is one of the oldest industries, yet comparatively it is one of the least technology-driven. About 95% of Engineering and Construction data captured goes unused, 13% of working hours are spent looking for data, and around 30% of companies have applications that don’t integrate.

With APTIM, we’re looking at early risk detection through predictive analysis and forecasting of material constraints, integrating with the ecosystem of software platforms, and reporting on real-time data with a ‘field-first’ focus – through initiatives like the Digital Foreman. The APTIM team has seen great wins in the field, utilising bar-code technology to check in thousands of material items quickly compared to manual methods.

There are three key areas when it comes to successful Materials Management in the software sector – culture, technology, and vendor engagement.

Given the state of world affairs, access to data needs to be off-site via the cloud to support remote working, providing a ‘single source of truth’ accessible by many parties. The tech sector is always growing, so companies need faster and more reliable access to this cloud data. Digital supply chain initiatives engage vendors a lot earlier in the process to drive collaboration with their clients, which gives more assurance as there is more emphasis on automating data capture.

It’s been a challenging period with the pandemic, particularly for the supply chain. Look at what happened in the Suez Canal – events can suddenly impact material costs and availability, and you really have to be more efficient to survive and succeed. Virtual system access can solve some issues, and you need to cast a wider net when it comes to data access.

Solving problems comes down to better visibility, and proactively solving issues with vendors and enabling construction teams to execute their work. The biggest cause of delays is not being able to provide teams with what they need.

On average, 2% of materials are lost or re-ordered, and that figure only reflects the material cost. What it does not capture is the duplicated procurement effort and the vendor and shipping costs, all of which also have an environmental impact.
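The hidden cost behind that 2% can be illustrated with a quick calculation. Only the loss rate comes from the text; the budget and overhead multiplier are hypothetical assumptions:

```python
# Rough model of the true cost of lost or re-ordered materials.
# Only the ~2% loss rate comes from the text; the budget and the
# overhead multiplier are hypothetical assumptions.

material_budget = 10_000_000   # total project material spend (assumed)
loss_rate = 0.02               # ~2% of materials lost or re-ordered
overhead_multiplier = 1.5      # duplicated procurement, vendor and shipping effort (assumed)

direct_loss = material_budget * loss_rate
true_loss = direct_loss * (1 + overhead_multiplier)

print(f"Direct re-order cost: ${direct_loss:,.0f}")  # $200,000
print(f"With hidden overhead: ${true_loss:,.0f}")    # $500,000
```

Even with conservative assumptions, the uncaptured overhead can multiply the visible material cost several times over.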

As things start to stabilise, APTIM continues to utilize SiteSense to boost efficiencies and solve productivity issues proactively. Integrating with 3D/4D modelling only scratches the surface of what we can do. Access to data can help you firm up bids to win work and make better cost estimates, and AI and ML are the next phase, providing an ecosystem of tools.

A key focus for Intelliwave and APTIM is to increase the availability of data, whether it’s creating a data warehouse for visualisations or increasing integrations to provide additional value. We want to move to more of an enterprise usage phase – up to now it’s been project-based – so more people can access data in real time.

