Philadelphia Technology Spotlight

Podcast Series, Hosted by John Panzica

Starting this week, vXchnge will be producing a monthly podcast series highlighting technologists and innovators in the Philadelphia community. The Philly Tech Spotlight podcast will dive deep into companies, market trends, and the startup community, spotlighting the important trends and information that Philadelphia brings to the national and global stage.

Be sure to subscribe to this podcast series, hosted by John Panzica, Senior Vice President of Sales for vXchnge, to be alerted of future shows. John’s first guest will be Bob Moul, CEO of Cloudamize. The discussion will include Philly’s competitive advantages from a talent and resource perspective along with how the cloud infrastructure analytics and cost optimization company plans to invest the additional $1 million in funding it recently landed.

Those not living in Philly may wonder: why a tech podcast focused on Philly? Those living in the Philadelphia community will tell you the city has become a rich hub where higher education, healthcare, and tech startups flourish. So much so that today the city is on the verge of becoming a metropolitan tech leader, now home to technologies spanning mobility, streaming video, and mobile app content.

Philly is host to large global companies like SAP America, Vanguard Group, and the Comcast Corporation. The startup community has also produced innovation and success with companies like Half.com (sold to eBay), Reality Online (acquired by Reuters), and, more recently, Boomi (acquired by Dell). As Philadelphia continues to move into the technology spotlight, it is worth remembering that the city has held the spotlight before and has long been a place of significance.

Whether you think back to when our Founding Fathers signed the Declaration of Independence in 1776 or reflect on the many universities and colleges that make it a top international study destination, you realize Philadelphia has been a pivotal part of American history. This tradition continues as the city now moves into the technology spotlight.

As the 5th largest city in the United States, with a population of more than 1.5 million, Philadelphia has a growing demand for mobile devices, IP traffic, and content that the technology industry is scrambling to serve. Demand for social media, streaming content, online discovery, and mobile purchasing requires companies in the content and infrastructure space to make considerable investments in Philadelphia. Such investments are a sure indication that Philadelphia technologists are rethinking the long-accepted strategy of serving local businesses and consumers through points of presence in New York or Northern Virginia. Philadelphia networks, infrastructure, and content providers are now realizing the value of serving local demand through points of presence within the city. In short, they are serving Philadelphia from Philadelphia.

Here at vXchnge, we have always believed in, and invested in, Philadelphia as a key market in the infrastructure industry. When the opportunity came to open our first brownfield data center, vXchnge chose to invest in the city of Philadelphia again. vX's investment of $30+ million has employed many local companies that, over the past 12 months, have done a spectacular job of building our data center.

This investment is just the start. As companies begin moving into our data center, vX has decided to invest again by creating this podcast series to highlight the technical and innovative achievements of this great city.

Next Steps:

Contact John at jpanzica@vxchnge.com or @jppanzica if you would like to be a guest or have topic suggestions.
Take a tour of our Philadelphia Data Center.
Subscribe to the podcast to be alerted when the next episode is published.
Learn more about our colocation services, carrier connectivity, power & space infrastructure, and our Remote Hands service.

How do IoT and Big Data affect the Data Center?

Getting Data Centers Ready for Big Data

Big Data has become a popular term for the vast amounts of data flowing into our data centers with increasing velocity and variety. The Internet of Things (IoT) constitutes a growing part of Big Data, with individual data sets arriving in terabytes or even exabytes (1,000,000 TB).
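
To put those units in perspective, here is a quick back-of-the-envelope sketch in Python. The 1 KB average reading size is an illustrative assumption, not a figure from the article.

```python
# Rough scale arithmetic: how many small IoT readings fit in the data-set
# sizes mentioned above. The 1 KB message size is an illustrative assumption.
TB = 10**12                # bytes in a terabyte (decimal definition)
EB = 10**18                # bytes in an exabyte = 1,000,000 TB
avg_reading_bytes = 1_000  # assumed average size of one sensor reading

print(f"1 EB = {EB / TB:,.0f} TB")
print(f"Readings per TB: {TB // avg_reading_bytes:,}")
print(f"Readings per EB: {EB // avg_reading_bytes:,}")
```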

Since IoT and Big Data arrive at a higher velocity than traditional data, they are not only more difficult to analyze but can also put additional stress on data centers. Where traditional data is highly structured, data from the IoT will not be nicely packaged and will likely be unstructured in nature.

How do IoT and Big Data affect the data center?

The research firm Gartner, Inc. believes that the IoT will include 26 billion units sending data to be processed by 2020. The firm also expects the product and service supplier market for the IoT to generate revenue exceeding $300 billion in the same timeframe. According to Gartner, “The Internet of Things (IoT) has a potential transformational effect on the data center market, its customers, technology providers, technologies, and sales and marketing models.”

All of this data will need to be processed and analyzed, increasing the workload for data centers and forcing them to deal with new capacity, security, and analytics problems.

Joe Skorupa, vice president and distinguished analyst with Gartner said, “The enormous number of devices, coupled with the sheer volume, velocity and structure of IoT data, creates challenges, particularly in the areas of security, data, storage management, servers and the data center network, as real-time business processes are at stake. Data center managers will need to deploy more forward-looking capacity management in these areas to be able to proactively meet the business priorities associated with IoT.”

Conclusion

The Internet of Things is going to have an impact on storage management practices in data centers across the world. These data centers will need to radically increase infrastructure resources and storage capacity to be ready to handle IoT-related data.

Bandwidth will also be affected in both businesses and data centers. The Internet of Things is going to shift the trend away from older, lower-speed WAN links toward higher-speed connections that can handle the flood of small messages coming from the devices that make up the IoT.
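
As a rough illustration of why those small messages add up, the sketch below estimates aggregate inbound traffic for a large device fleet. The message size and send interval are assumptions chosen only to make the arithmetic concrete; the device count is Gartner's 2020 forecast cited above.

```python
# Back-of-the-envelope estimate of the aggregate bandwidth generated by a
# fleet of IoT devices sending small, frequent messages. Message size and
# send interval are illustrative assumptions.
devices = 26_000_000_000      # Gartner's 2020 device forecast cited above
message_bytes = 512           # assumed payload size per message
interval_seconds = 60         # assumed one message per device per minute

bytes_per_second = devices * message_bytes / interval_seconds
gbps = bytes_per_second * 8 / 10**9
print(f"Aggregate inbound traffic: {gbps:,.0f} Gbit/s")
```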

Data centers must start preparing for the additional storage, processing, and bandwidth required for Big Data and the Internet of Things. In order to handle this load, data centers will also face challenges with increased virtualization, which requires additional power and cooling.

The Internet of Things (IoT): All Talk or Transforming Your Data Center?

Ready for the next big tech revolution? Some experts point to the Internet of Things (IoT), which is primed to storm the market by connecting devices of all shapes and sizes — from traditional server hardware to point-of-sale terminals and parking spaces. As noted by a Deutsche Bank study, however, hype currently outpaces revenue by a large margin; is IoT all talk, or on track to transform your data center?
Practically Speaking
When it comes to IoT, “stuff” might be a better descriptor, according to Intel’s Genevieve Bell. Speaking at MIT Technology Review’s Digital Summit, Bell argued that “things” implies connecting the same item over and over again, while an “Internet of Stuff” is more accurate: a collection of varied devices, from lawn mowers to light switches. In her view, reasons for using IoT vary significantly across corporations and municipalities, meaning there’s no “right” way to implement this technology, and no set rules about what gets connected.

Want a more practical example? Deutsche Bank details an IoT use case focused on “smart retail.” It starts with IoT sensors at the physical store level, which monitor time-sensitive, transactional data such as inventory levels, product sales metrics, and customer volumes. It then goes beyond point-of-sale and stock-room controls to include wireless parking sensors, motion detectors, and even environmental metrics. The goal is to provide a holistic revenue picture, one that accounts for changes in weather, traffic patterns, or product stock levels in real time. The study sees wireline and mobile IoT endpoints sending data through an edge cloud into the data center, where it is then distributed among data analytics, ERP, and IoT control apps.
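
To make that data flow concrete, here is a minimal sketch of how a store-level reading might be packaged before it is handed to an edge collector and forwarded to the data center. The field names, store ID, and payload format are hypothetical, not taken from the Deutsche Bank study.

```python
# Minimal sketch of the "smart retail" flow described above: a store-level
# sensor reading is packaged as JSON for an edge collector, which would
# forward it to the data center for analytics, ERP, and IoT control apps.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class SensorReading:
    store_id: str
    sensor_type: str      # e.g. "inventory", "parking", "motion"
    value: float
    recorded_at: str

def to_edge_payload(reading: SensorReading) -> str:
    """Serialize a reading for transmission to an edge collector."""
    return json.dumps(asdict(reading))

reading = SensorReading(
    store_id="store-042",                 # hypothetical store identifier
    sensor_type="inventory",
    value=37,                             # units left on the shelf
    recorded_at=datetime.now(timezone.utc).isoformat(),
)
print(to_edge_payload(reading))
# In a real deployment this payload would be sent to an edge gateway, then
# fanned out inside the data center to the downstream applications.
```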

No Small Feat
Right now, big data focuses on information generated from standard collection systems, but imagine the flow-rate change if every business unit had access to cost-effective wireless, mobile sensors. “Big data” doesn’t do it justice; research firm Gartner predicts that by 2020, the IoT will connect 26 billion devices, all transmitting a continuous stream of information.

The result? Transformation is coming as IoT-enabled devices begin penetrating the market. Companies looking to future-proof their data centers have two options: Managed hosting or colocation. Managed hosting shifts data center responsibility to a provider and utilizes that provider’s hardware to store, analyze and manipulate data. Colocation, meanwhile, keeps hardware ownership with your company but contracts out physical cabinet storage, power, cooling, and support. With IoT technologies set to create diffuse networks of heterogeneous devices, increased cabinet density through intelligent power management can help maximize server efficacy.

IoT won’t transform your data center tomorrow, but in five years companies that haven’t future-proofed their cabinets will find it difficult to generate ROI from this nebulous network of devices. Now is the time to consider server best practices and potential — when value catches up with vision, you’ll be ready.

SAP HANA Doesn’t Have to be Hard

Are you considering SAP HANA, but you’re not sure where to start?

SAP HANA is a relational database that runs in-memory for fast access, allowing business analysts to query large quantities of data without waiting for the database to read and write to disk. This speed allows users to explore and analyze both analytical and transactional data from almost any data source.
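
For a sense of what that looks like in practice, here is a minimal sketch of querying HANA from Python using SAP's hdbcli driver. The hostname, port, credentials, and table name are placeholders you would replace with your own.

```python
# A minimal sketch of querying SAP HANA from Python with SAP's hdbcli
# driver (pip install hdbcli). Host, port, credentials, and the table name
# below are placeholders, not a real environment.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana.example.com",  # placeholder hostname
    port=30015,                  # typical SQL port for instance 00
    user="ANALYST",
    password="********",
)
try:
    cursor = conn.cursor()
    # Aggregate transactional rows in-memory on the HANA side.
    cursor.execute(
        "SELECT PRODUCT_ID, SUM(QUANTITY) FROM SALES_ORDERS GROUP BY PRODUCT_ID"
    )
    for product_id, total_quantity in cursor.fetchall():
        print(product_id, total_quantity)
finally:
    conn.close()
```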

However, users of SAP HANA face some common challenges:

1. Slow deployments can be costly

One common complaint about SAP HANA is that slow implementations can be costly; rapid deployments, by contrast, accelerate ROI. Most organizations can expect a cloud-based SAP HANA implementation to be completed in less than 30 days.

Using SAP HANA in the cloud allows businesses to shift from a large upfront capital expense to a more manageable monthly operational expense. This can reduce infrastructure total cost of ownership by 28% or more on average over a three-year period.
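
The sketch below shows the shape of that capex-versus-opex comparison. The dollar figures are invented for illustration; only the 28% figure comes from the text above.

```python
# Illustrative comparison of an upfront capital purchase versus a monthly
# subscription over a three-year horizon. All dollar amounts are assumptions.
on_prem_capex = 1_000_000          # assumed upfront hardware/licensing spend
on_prem_annual_ops = 150_000       # assumed yearly power, support, admin
years = 3

on_prem_tco = on_prem_capex + on_prem_annual_ops * years
cloud_tco = on_prem_tco * (1 - 0.28)   # 28% lower TCO over the same period
cloud_monthly = cloud_tco / (years * 12)

print(f"On-premises 3-year TCO: ${on_prem_tco:,.0f}")
print(f"Cloud 3-year TCO:       ${cloud_tco:,.0f}")
print(f"Equivalent monthly opex: ${cloud_monthly:,.0f}")
```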

2. Latency and speed issues

There are real speed benefits to being physically close to the HANA production environment. When your application lives in the same data center as the SAP HANA cloud, your data is only nanoseconds away from the HANA engine. That kind of speed is possible when your physical data center infrastructure lets you incubate, test, and grow IT environments with direct, low-latency access to the HANA cloud.
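
As a rough illustration of why physical proximity matters, the sketch below compares propagation delay for a same-facility cross connect with a metro-to-metro fiber run. The distances and the assumed signal speed in fiber are illustrative, not measurements.

```python
# Rough propagation-delay comparison: a cross connect inside the same
# facility versus a run to a remote metro. Distances and the ~2/3 c signal
# speed in fiber are illustrative assumptions.
SPEED_IN_FIBER_M_PER_S = 2e8       # roughly two-thirds the speed of light

def one_way_delay_us(distance_m: float) -> float:
    """One-way propagation delay in microseconds for a given fiber run."""
    return distance_m / SPEED_IN_FIBER_M_PER_S * 1e6

print(f"Same-facility cross connect (50 m): {one_way_delay_us(50):.2f} µs")
print(f"Metro to metro (150 km):            {one_way_delay_us(150_000):.0f} µs")
```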

Faster access also improves flexibility: it allows you to grow an SAP HANA database, add new apps, and expand your geographic footprint with lower cost, greater speed, and less disruption. Having HANA in the cloud lets the service provider focus on delivering the SAP HANA solution, so your IT department doesn’t have to accommodate a new platform and manage yet another system.

3. Security concerns

While many data centers offer five levels of security, there are additional advantages to being in the same data center as the SAP HANA cloud. One is that your traffic never touches public IP addresses; another is that you get a point-to-point cross connect directly to the HANA cloud. This keeps information exchange off public networks entirely and provides a more secure point-to-point connection.

4. It’s a complex implementation

SAP HANA has an unfortunate reputation for being complex and hard to deploy, especially as an on-premises solution. Moving an on-premises solution to the cloud lets your service provider take care of the complex configurations for you. While some businesses may be hampered by a complex on-premises deployment, the right provider makes it easy to get customizations up and running quickly. The service provider manages and optimizes your SAP HANA implementation so that your IT department doesn’t have to.

Optimization is critical for any system, and businesses have learned that it is not a one-and-done exercise; it must be revisited on an ongoing basis. A strategic partner that manages your SAP HANA service can keep helping you optimize, applying the latest best practices learned across the industry.

5. No current analytics solution

What if your company doesn’t have an analytics solution? Can you still get value from legacy systems whose disparate data needs to be combined into one dataset? Using SAP HANA, you can handle these large loads of data from legacy systems and tie them all together. This is one of the true business benefits of analytics.
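
As a toy illustration of that consolidation step, the sketch below combines records from two hypothetical legacy systems into one dataset. In a HANA deployment this join would typically run in-database; pandas is used here only to show the idea, and the column names are invented.

```python
# Toy sketch of combining disparate legacy data into one dataset for
# analysis. The data, keys, and column names are invented for illustration.
import pandas as pd

erp_orders = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "order_total": [250.0, 80.0, 410.0],
})
crm_contacts = pd.DataFrame({
    "customer_id": [101, 102, 104],
    "region": ["Northeast", "Midwest", "South"],
})

# Merge the two sources on their shared key into a single dataset.
combined = erp_orders.merge(crm_contacts, on="customer_id", how="outer")
print(combined)
```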

Conclusion

SAP HANA doesn’t have to be hard. To be successful, you need a data center with flexible, scalable deployment options that is close to the SAP HANA cloud. That data center also needs the available space and power to handle high-density equipment, along with interconnects to SAP HANA and other carriers.

Combining the right data center with an SAP HANA partner can give you access to the industry experts, speed, and security that your company needs to succeed.