Big Data Team Develops a Technology Strategy
Big Data Business Strategy won’t magically result in business value without a robust technology strategy and platform. The ComTec Big Data team develops a vendor-agnostic technology strategy and roadmap to support the Big Data Business Strategy, covering both the infrastructure and software platforms needed to store big data and extract insights from it through reporting and analytics.
Our team assesses the current IT infrastructure and defines the capabilities necessary for the big data initiative. We develop an integrated technology strategy that supports the collection, consolidation, management, and security of data, and that deals efficiently with all types of data: structured, semi-structured, and unstructured. The architecture must be able to handle both the volume and velocity required to process the data; typical Big Data volumes reach multiple terabytes (TB), petabytes (PB), or more.
The ComTec Big Data team considers the existing IT infrastructure along with scale-out, scale-up, and virtualization approaches when designing the technology strategy.
Different scenarios require different big data technologies. The exact combination differs between organizations, depending on requirements as well as existing environments.
Scale-out: Scale-out technologies, such as Hadoop, distribute compute power across numerous servers, each typically with local storage. Increasing capacity is as simple as adding more nodes to the environment. The use of local storage means that each server typically processes data locally rather than accessing data on attached shared storage; local storage is also less expensive than shared storage technologies.
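The scale-out principle described above, moving computation to the node that holds the data locally and merging partial results, can be sketched in a few lines. This is an illustrative toy, not the Hadoop API; the node names and hash-partitioning scheme are assumptions for the example.

```python
from collections import Counter, defaultdict

# Illustrative sketch of scale-out processing (not Hadoop itself):
# records are hash-partitioned across nodes with local storage, each
# node aggregates its own shard, and partial results are merged.

NODES = ["node-1", "node-2", "node-3"]  # hypothetical cluster nodes

def partition(records, nodes):
    """Assign each (key, value) record to a node by hashing its key."""
    shards = defaultdict(list)
    for key, value in records:
        shards[nodes[hash(key) % len(nodes)]].append((key, value))
    return shards

def local_aggregate(shard):
    """Each node sums values for the keys stored locally."""
    totals = Counter()
    for key, value in shard:
        totals[key] += value
    return totals

def run(records, nodes):
    """Partition, compute locally on each node, then merge results."""
    shards = partition(records, nodes)
    merged = Counter()
    for node in nodes:
        merged.update(local_aggregate(shards.get(node, [])))
    return dict(merged)
```

Note that adding capacity in this model is just appending another entry to the node list, which is exactly why scale-out growth is incremental.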
Scale-up: Converged systems, such as SAP HANA and Oracle Exadata, can handle both transaction processing and real-time analytics. Converged infrastructure is stateless, preconfigured, and pre-optimized, and often comes preloaded with applications. It eliminates the busywork of provisioning servers, SAN, LAN, and applications, which normally takes multiple teams of people, and replaces it with plug-and-play options that come ready to go (more or less) out of the box.
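The workload pattern that converged systems such as SAP HANA target, one store serving both transactional writes and analytical reads with no separate ETL step, can be sketched with a toy in-memory table. The `orders` table and its methods here are hypothetical; real systems achieve this with in-memory columnar engines.

```python
# Toy sketch of hybrid transactional/analytical processing: the same
# in-memory store accepts OLTP-style writes and answers OLAP-style
# aggregates over the live data, without copying it to a warehouse.

class InMemoryStore:
    def __init__(self):
        self.orders = []  # hypothetical single table

    def insert_order(self, customer, amount):
        """Transactional write path (OLTP)."""
        self.orders.append({"customer": customer, "amount": amount})

    def revenue_by_customer(self):
        """Analytical read path (OLAP) over the same live rows."""
        totals = {}
        for row in self.orders:
            totals[row["customer"]] = totals.get(row["customer"], 0) + row["amount"]
        return totals
```

The design choice worth noticing is that the analytics query always sees the latest committed writes, which is the core appeal of converged platforms over a nightly-loaded warehouse.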
Virtualization: Virtualizing servers is typically done for portability and ease of deployment. Because Big Data technologies make heavy use of server processing and disk I/O, virtualization is generally not seen as beneficial for production Big Data workloads. However, test and development systems can be implemented on virtual machines.