Philadelphia Technology Spotlight

Podcast Series, Hosted by John Panzica

Starting this week, vXchnge will be producing a monthly podcast series highlighting technologists and innovators in the Philadelphia community. The Philly Tech Spotlight Podcast will dive deep into companies, market trends, and the startup community to spotlight the important trends and ideas that Philadelphia brings to the national and global stage.

Be sure to subscribe to this podcast series, hosted by John Panzica, Senior Vice President of Sales for vXchnge, to be alerted to future shows. John’s first guest will be Bob Moul, CEO of Cloudamize. The discussion will cover Philly’s competitive advantages from a talent and resource perspective, along with how the cloud infrastructure analytics and cost optimization company plans to invest the additional $1 million in funding it recently landed.

Those not living in Philly may wonder: why a tech podcast focused on Philly? Those living in the Philadelphia community will tell you Philly has become a rich hub where higher education, healthcare, and tech startups flourish. So much so that today the city is on the verge of becoming a metropolitan tech leader, home to technologies spanning mobility, streaming video, and mobile app content.

Philly is host to large global companies like SAP America, Vanguard Group, and the Comcast Corporation. The startup community has also produced innovation and success with companies like Half.com (sold to eBay), Reality Online (acquired by Reuters), and, most recently, Boomi (acquired by Dell). As Philadelphia continues to move into the technology spotlight, it is worth remembering that the city has held the spotlight before.

Whether you think back to when our Founding Fathers signed the Declaration of Independence in 1776 or reflect on the many universities and colleges that make it a top international study destination, you realize Philadelphia has been a pivotal part of American history. This tradition continues as the city now moves into the technology spotlight.

Philadelphia is the 5th largest city in the United States, with a population of roughly 1.5 million, and its growing demand for mobile devices, IP traffic, and content has left the technology industry scrambling to keep up. Demand for social media, streaming content, online discovery, and mobile purchasing requires companies in the content and infrastructure space to make considerable investments in Philadelphia. Such investments are a sure indication that Philadelphia technologists are rethinking the long-accepted strategy of serving local businesses and consumers through points of presence in New York or Northern Virginia. Philadelphia networks, infrastructure, and content providers now see the value of serving local demand through points of presence within the city. In short, they are serving Philadelphia from Philadelphia.

Here at vXchnge, we have always believed that Philadelphia is a key market in the infrastructure industry, and we have invested accordingly. When the opportunity arose to open its first brownfield data center, vXchnge chose to invest in the city of Philadelphia again. vX’s investment of $30+ million has employed many local companies that, over the past 12 months, have done a spectacular job of building our data center.

This investment is just the start. As companies begin moving into our data center, vX has decided to invest again by creating this podcast series to highlight the technical and innovative achievements of this great city.

Next Steps:

Contact John at jpanzica@vxchnge.com or @jppanzica if you would like to be a guest or have topic suggestions.
Take a tour of our Philadelphia Data Center.
Subscribe to the podcast to be alerted when the next episode is published.
Learn more about our colocation services, carrier connectivity, power & space infrastructure, and our Remote Hands service.


How do IoT and Big Data affect the Data Center?


Getting Data Centers Ready for Big Data

Big Data has become a popular term for the vast amounts of data flowing into our data centers with increasing velocity and variety. The Internet of Things (IoT) constitutes a growing share of Big Data, arriving in terabytes or even exabytes (1 exabyte = 1,000,000 TB) per data set.

Because IoT and Big Data arrive at a higher velocity than traditional data, they are not only more difficult to analyze but also put additional stress on data centers. Where traditional data is highly structured, data from the IoT will not be nicely packaged and will likely be unstructured in nature.
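
As a concrete illustration, here is a minimal Python sketch of what “not nicely packaged” means in practice: two sensors reporting the same measurement under different, vendor-specific field names, normalized into one record before analysis. The field and device names are hypothetical.

```python
import json
from datetime import datetime, timezone

def normalize_reading(raw: bytes) -> dict:
    """Coerce a heterogeneous IoT payload into a common record.

    The field names ('temp', 'temperature', 'device', 'id') stand in
    for the many vendor-specific formats a real feed mixes together.
    """
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError:
        # Not JSON at all -- keep the raw bytes for later inspection.
        return {"device_id": None, "value": None,
                "received_at": datetime.now(timezone.utc).isoformat(),
                "raw": raw.decode("utf-8", errors="replace")}

    return {
        "device_id": msg.get("device") or msg.get("id"),
        "value": msg.get("temp") or msg.get("temperature"),
        "received_at": datetime.now(timezone.utc).isoformat(),
        "raw": None,
    }

# Two messages that mean the same thing but share no schema:
print(normalize_reading(b'{"device": "hvac-7", "temp": 21.5}'))
print(normalize_reading(b'{"id": "hvac-8", "temperature": 22.1}'))
```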

How do IoT and Big Data affect the data center?

The research firm Gartner, Inc. believes that the IoT will include 26 billion units sending data to be processed by 2020, and it expects the product and service supplier market for IoT to generate revenue exceeding $300 billion in the same timeframe. According to Gartner, “The Internet of Things (IoT) has a potential transformational effect on the data center market, its customers, technology providers, technologies, and sales and marketing models.”

All of this data will need to be processed and analyzed, which will increase the workload for data centers and force them to deal with new capacity, security, and analytics problems.

Joe Skorupa, vice president and distinguished analyst with Gartner said, “The enormous number of devices, coupled with the sheer volume, velocity and structure of IoT data, creates challenges, particularly in the areas of security, data, storage management, servers and the data center network, as real-time business processes are at stake. Data center managers will need to deploy more forward-looking capacity management in these areas to be able to proactively meet the business priorities associated with IoT.”

Conclusion

The Internet of Things is going to have an impact on storage management practices in data centers across the world. These data centers will need to radically increase infrastructure resources and storage capacity to be ready to handle the Internet of Things-related data.

Bandwidth will also be affected in both businesses and data centers. The Internet of Things will shift the trend away from older, lower-speed WAN links to higher-speed connections able to handle the flood of small messages coming from the devices that comprise the IoT.
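
A back-of-envelope calculation shows why many small messages matter. The device counts and message sizes below are illustrative assumptions, not measurements:

```python
# Rough estimate of aggregate inbound bandwidth from small IoT messages.
devices = 50_000          # IoT endpoints served by one facility (assumed)
msg_bytes = 200           # typical small sensor message, incl. headers
msgs_per_min = 6          # one reading every 10 seconds

bits_per_sec = devices * msg_bytes * 8 * msgs_per_min / 60
print(f"Sustained inbound: {bits_per_sec / 1e6:.1f} Mbit/s")
# ~8 Mbit/s sustained -- modest in raw throughput, but millions of tiny
# packets per hour stress routers and firewalls long before they
# saturate a WAN link, which is why link speed alone is not the story.
```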

Data centers must start preparing now for the additional storage, processing, and bandwidth that Big Data and the Internet of Things will require. Handling that load will also mean more virtualization, which in turn demands additional power and cooling.

The Internet of Things (IoT): All Talk or Transforming Your Data Center?


Ready for the next big tech revolution? Some experts point to the Internet of Things (IoT), which is primed to storm the market by connecting devices of all shapes and sizes, from traditional server hardware to point-of-sale terminals and parking spaces. As a Deutsche Bank study notes, however, hype currently outpaces revenue by a large margin. Is IoT all talk, or on track to transform your data center?
Practically Speaking
When it comes to IoT, “stuff” might be a better descriptor, according to Intel’s Genevieve Bell. Speaking at MIT Technology Review’s Digital Summit, Bell argued that “things” implies connections of the same item over and over again, while an “Internet of Stuff” is more accurate: a collection of varied devices, from lawn mowers to light switches. In her view, reasons for using IoT vary significantly across corporations or municipalities, meaning there’s no “right” way to implement this technology, and no set rules about what gets connected.

Want a more practical example? Deutsche Bank details an IoT use case focusing on “smart retail.” It starts with IoT sensors at the physical store level, which monitor time-sensitive, transactional data such as inventory levels, product sales metrics, and customer volumes. It goes beyond point-of-sale and stock room controls to include wireless parking sensors, motion detectors, and even environmental metrics. The goal is to provide a holistic revenue picture, one that accounts for changes in weather, traffic patterns, or product stock levels in real time. The study sees wireline and mobile IoT end-points sending data through an edge cloud into the data center, where it is distributed among data analytics, ERP, and IoT control apps.
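
To make that flow concrete, here is a hedged sketch of the edge-aggregation step: reducing raw per-sensor readings to per-store summaries before they are forwarded to the data center. The store, sensor, and field names are invented for illustration.

```python
from collections import defaultdict

def aggregate_at_edge(readings):
    """Average raw sensor readings per (store, sensor) pair, so the
    edge node forwards compact summaries instead of every message."""
    summary = defaultdict(lambda: {"count": 0, "total": 0.0})
    for r in readings:
        key = (r["store"], r["sensor"])
        summary[key]["count"] += 1
        summary[key]["total"] += r["value"]
    return {k: v["total"] / v["count"] for k, v in summary.items()}

readings = [
    {"store": "phl-01", "sensor": "foot_traffic", "value": 14},
    {"store": "phl-01", "sensor": "foot_traffic", "value": 22},
    {"store": "phl-01", "sensor": "parking_occupancy", "value": 0.8},
]
print(aggregate_at_edge(readings))
# {('phl-01', 'foot_traffic'): 18.0, ('phl-01', 'parking_occupancy'): 0.8}
```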

No Small Feat
Right now, big data focuses on information generated from standard collection systems, but imagine the flow-rate change if every business unit had access to cost-effective wireless, mobile sensors. “Big data” doesn’t do it justice; research firm Gartner predicts that by 2020, the IoT will connect 26 billion devices, all transmitting a continuous stream of information.

The result? Transformation is coming as IoT-enabled devices begin penetrating the market. Companies looking to future-proof their data centers have two options: Managed hosting or colocation. Managed hosting shifts data center responsibility to a provider and utilizes that provider’s hardware to store, analyze and manipulate data. Colocation, meanwhile, keeps hardware ownership with your company but contracts out physical cabinet storage, power, cooling, and support. With IoT technologies set to create diffuse networks of heterogeneous devices, increased cabinet density through intelligent power management can help maximize server efficacy.
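
A quick sizing sketch shows why power, not floor space, usually bounds cabinet density, and why intelligent power management matters. All figures are illustrative assumptions for a colocation scenario:

```python
# Rough sizing: how many servers fit in a cabinet under a power budget?
cabinet_budget_kw = 8.0      # power available per cabinet (assumed)
watts_per_server = 450       # draw of one 1U server under load (assumed)
rack_units = 42              # physical slots in a standard cabinet

by_power = int(cabinet_budget_kw * 1000 // watts_per_server)
by_space = rack_units
print(f"Servers per cabinet: {min(by_power, by_space)} "
      f"(power allows {by_power}, space allows {by_space})")
# Power is the binding constraint here (17 vs. 42), which is why
# intelligent power management drives usable cabinet density.
```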

IoT won’t transform your data center tomorrow, but in five years companies that haven’t future-proofed their cabinets will find it difficult to generate ROI from this nebulous network of devices. Now is the time to consider server best practices and potential — when value catches up with vision, you’ll be ready.


SAP HANA Doesn’t Have to be Hard

Are you considering SAP HANA, but you’re not sure where to start?

SAP HANA is a relational database that runs in-memory for fast access, allowing business analysts to query large quantities of data without waiting for the database to read from and write to disk. This speed allows users to explore and analyze both analytical and transactional data from almost any data source.
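
For the curious, here is a minimal sketch of querying HANA from Python using SAP’s hdbcli driver. The connection details and the SALES table and its columns are placeholders; substitute your own host and schema.

```python
from hdbcli import dbapi  # SAP's Python driver for HANA

# Placeholder connection details -- replace with your own.
conn = dbapi.connect(address="hana.example.com", port=30015,
                     user="ANALYST", password="...")
cursor = conn.cursor()

# Aggregations like this run against columnar, in-memory tables,
# which is where HANA's interactive speed comes from.
cursor.execute("""
    SELECT region, SUM(amount) AS total
    FROM SALES
    WHERE sale_date >= ADD_DAYS(CURRENT_DATE, -30)
    GROUP BY region
    ORDER BY total DESC
""")
for region, total in cursor.fetchall():
    print(region, total)

cursor.close()
conn.close()
```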

However, users of SAP HANA face some common challenges:

1. Slow deployments can be costly

A common complaint about SAP HANA is that slow implementations are costly, while rapid deployments increase ROI. Most organizations can expect a cloud-based SAP HANA implementation to be up and running in less than 30 days.

Using SAP HANA in the cloud allows businesses to shift from a large upfront capital expense to a more manageable monthly operational expense. This reduces infrastructure total cost of ownership by an average of 28% or more over a three-year period.
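
The arithmetic is straightforward. A hedged sketch, using made-up dollar inputs and the 28% figure from above:

```python
# Illustrative three-year TCO comparison, capex vs. cloud opex.
# The 28% reduction comes from the claim above; the dollar amounts
# are invented inputs chosen only to show the arithmetic.
on_prem_capex = 500_000          # upfront hardware and licensing
on_prem_opex_yr = 60_000         # power, space, admin per year
on_prem_tco = on_prem_capex + 3 * on_prem_opex_yr

cloud_tco = on_prem_tco * (1 - 0.28)   # 28% lower, per the claim
cloud_monthly = cloud_tco / 36         # spread over 36 months

print(f"On-prem 3-yr TCO:   ${on_prem_tco:,.0f}")
print(f"Cloud 3-yr TCO:     ${cloud_tco:,.0f}")
print(f"Equivalent monthly: ${cloud_monthly:,.0f}")
```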

2. Latency and speed issues

There are speed benefits to being physically close to the SAP HANA cloud. When your application lives in the same data center as the HANA cloud, close geographic proximity puts your data mere nanoseconds of propagation delay away from the SAP HANA engine. That kind of speed is possible when your physical data center infrastructure lets you incubate, test, and grow IT environments with near-instant access to the HANA cloud.
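
A simple propagation-delay calculation illustrates the proximity advantage. The distances are assumptions; light in fiber travels at roughly two-thirds its vacuum speed:

```python
# Why proximity matters: light in fiber covers about 2e8 m/s,
# i.e., roughly 5 ns per meter. Distances below are assumptions.
SPEED_IN_FIBER = 2.0e8  # meters per second

def round_trip_ms(distance_m: float) -> float:
    """Round-trip propagation delay in milliseconds."""
    return 2 * distance_m / SPEED_IN_FIBER * 1000

print(f"Same-building cross connect (50 m): {round_trip_ms(50):.4f} ms")
print(f"Remote metro PoP (~150 km):         {round_trip_ms(150_000):.2f} ms")
# 0.0005 ms (500 ns) vs. 1.5 ms of propagation alone; switching and
# routing hops add more latency at each end, but the gap persists.
```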

Faster access improves flexibility: it allows you to grow an SAP HANA database, add new apps, and expand your geographic footprint with lower cost, greater speed, and less disruption. Having HANA in the cloud also lets the service provider focus on delivering the SAP HANA solution, so your IT department doesn’t have to accommodate a new platform and manage yet another system.

3. Security concerns

While many data centers offer five levels of security, there are advantages to being in the same data center as the SAP HANA cloud. One is that you never touch any public IP addresses; another is that you get a point-to-point cross connect directly to the HANA cloud, eliminating information exchange over public networks in favor of a more secure dedicated connection.

4. It’s a complex implementation

It’s unfortunate that SAP HANA has a reputation for being complex and hard to deploy, especially as an on-premises solution. Moving an on-premises deployment to the cloud lets your service provider handle the complex configuration for you. While some businesses may be hampered by a complex on-premises solution, the right provider will get customizations up and running quickly, then manage and optimize your SAP HANA implementation so that your IT department doesn’t have to.

Optimization is critical for any system, and businesses have learned that it is not a once-and-done task: it is an ongoing process that must be revisited regularly. A strategic partner that manages your SAP HANA service can keep helping you optimize, applying the latest best practices from across the industry.

5. No current analytics solution

What if your company doesn’t have an analytics tool? Can you still benefit from all of those legacy systems whose disparate data needs to be combined into one dataset? With SAP HANA, you can handle large loads of data from legacy systems and tie them all together. This is one of the true business benefits of analytics.

Conclusion

SAP HANA doesn’t have to be hard. To be successful, you need a data center with flexible, scalable deployment options close to the SAP HANA cloud, with the available space and power to handle high-density equipment and interconnects to SAP HANA and other carriers.

Combining the right data center with an SAP HANA partner can give you access to the industry experts, speed, and security that your company needs to succeed.

Cyber Security Breaches Are a Daily Occurrence: Two Companies That Can Help Protect Your Data

Cyber-attacks seem to be picking up in frequency; the mainstream news reports attacks on what seems like a weekly basis. It is not a subject most people give much thought to, until big brand names start getting hacked and the collateral damage becomes obvious: individual stories of ordinary people hit hard, their credit card information used by fraudulent parties.

A few weeks ago, there was an interesting interview on 60 Minutes featuring FBI Director James Comey, who has led the Federal Bureau of Investigation for a year. His thoughts on cyber security and the threats faced by individuals and U.S. companies were eye-opening; he stated that more attacks than can be counted occur every day. He described the Internet as the connection to an individual’s life: between pictures of family, financial information, friendships, and hobbies, once hackers crack the front door of a computer, they have access to an individual’s life in its entirety. He equated this to a dangerous parking lot where nation states, organized hackers, terrorists, and criminals prey on the public. Before the advent of the Internet, an individual’s most personal information was never so easily accessible in a single place, nor so vulnerable to threats originating on the other side of the world.

With the subject so prominent, and fear in some cases spreading through individuals, companies, and communities, some research was in order: how big is this growing trend, and which companies can help protect personal or corporate data? A June 2014 study by the Ponemon Institute (an independent research facility specializing in information security policy, privacy, and data protection) examined the number of companies that have fallen prey to cyber security breaches and found that 62% of respondents did not trust the security of their cloud infrastructure. That figure represents a rise of 10% from the previous year, and the Institute’s data indicates it is expected to keep climbing.

Companies that have experienced security breaches include some of the largest businesses in the United States, such as:

  • JPMorgan – The attack perpetrated against JPMorgan Chase & Co. was first made public in July. It compromised regular household and small business information: roughly 76 million households and about 7 million small businesses were affected. While no money was apparently stolen, internal information and personal information (not including social security or account numbers, birth dates, or passwords) held by JPMorgan on its customers was exposed. There is speculation that this attack was a precursor to a larger future action.
  • Home Depot – A breach was confirmed by Home Depot on September 8. It put 56 million cardholder accounts at risk and affected anyone who made a payment at any Home Depot from April through September. In response, Home Depot implemented a very large payment security project to offer customers better protection through more advanced data encryption.
  • Target – The security breach Target suffered during the 2013 holiday season was revealed by the company in January but had already been reported in mid-December by an independent cyber-security expert. Information for 70 million customers was breached, including names, addresses, email addresses, and phone numbers. Another 40 million customers had data stolen for debit and credit cards used at the retailer before and during the 2013 holiday season.

Predictably, sales have fallen following revelations of security breaches. Stock prices can also respond unfavorably, leading to huge losses and other detrimental impacts felt for several quarters after a breach. Target’s sales have only recently begun to recover; after the breach was revealed, sales for the chain slid 2%-6% from the prior year.

Growing Cause for Concern

Breaches are increasing in both scale and frequency. Cyber-attacks are now capable of affecting anyone who uses debit or credit cards, or holds an account with any business that requires name, address, phone, email or other identifying information. They’re not confined to the US, either.

In early 2014, a data breach compromised the personal data and credit card information of more than 25 million South Koreans, over 70% of the country’s population between the ages of 15 and 65. The breach was caused by a worker at the Korea Credit Bureau, which works with Korean credit card companies to provide credit scores for Korean consumers.

While the numbers have improved, more than a quarter of companies still have neither a team nor a plan in place for responding to data breaches; last year, nearly four in ten companies lacked such precautions. Even among companies that have prepared, only 30% believed they were adequately equipped to handle a cyber-attack. Many companies with plans in place have not reviewed them on a quarterly basis, and most plans have not been updated or reviewed since they were implemented.

Whether you are an individual trying to protect personal computers and information, or a corporation trying to protect computers and information at work, the long-standing advice is to put the following into practice:

  • Don’t open attachments from people you don’t know.
  • Don’t open attachments from people you know, but from whom you are not expecting to receive a particular type of file.
  • Don’t follow website links from unknown email senders.
  • Check the naming convention of website links to ensure that you’re being directed to a legitimate site (a small sketch of this check follows the list).
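
As promised, a minimal Python sketch of that last rule: compare a link’s hostname against an allowlist of trusted domains. The domains are examples only; a production check would also handle lookalike characters and consult a public-suffix list.

```python
from urllib.parse import urlparse

# Example allowlist -- in practice this would be your own trusted domains.
TRUSTED = {"chase.com", "homedepot.com", "target.com"}

def looks_legitimate(url: str) -> bool:
    """True if the URL's hostname is a trusted domain or a subdomain
    of one. Phishing links often bury a real brand name inside a
    hostname that actually belongs to an attacker's domain."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in TRUSTED)

print(looks_legitimate("https://www.chase.com/login"))         # True
print(looks_legitimate("http://chase.com.account-verify.ru"))  # False
```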

At this stage, nearly everyone knows to follow these basic rules. However, hackers and cyber threats keep getting more elaborate, as with the NBC attack last winter. In that case, hackers hijacked NBC.com as the delivery mechanism for malware; a cyber-criminal embedded invisible malicious elements across some of the NBC websites, rotating the files periodically to avoid detection.

What happens when malware finds its way onto your computer? Depending on which “exploit kit” is used, various pieces of outdated software and plugins are found and targeted. When users have not installed all relevant updates, the exploit kit can reside in the system until it finds personal financial data, which is then sold on the black market.

To limit the risk of your company’s websites being used in drive-by malware attacks, it is important to monitor payloads. Monitoring payloads makes it possible to detect indicators of an ongoing attack at an earlier point. It is one of many continuous diagnostic programs an organization should run in order to create a more appropriate layer of security.
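
One hedged sketch of what payload monitoring can look like in practice: periodically fetch your own pages and flag script tags loading from unexpected domains, the kind of injected element used in the NBC.com compromise. The URL and expected-domain list are placeholders.

```python
import re
import urllib.request

# Placeholder allowlist of domains your pages legitimately load from.
EXPECTED = ("example.com", "cdn.example.com")

def unexpected_scripts(url: str) -> list:
    """Return external script sources on the page that don't match
    any expected domain -- a simple indicator of injected content."""
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    srcs = re.findall(r'<script[^>]+src=["\'](https?://[^"\']+)', html)
    return [s for s in srcs
            if not any(domain in s for domain in EXPECTED)]

for suspicious in unexpected_scripts("https://example.com/"):
    print("Unexpected script source:", suspicious)
```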

Many companies are now hiring cyber threat teams composed of some of the world’s best former military and government security experts. These teams help the company build a more secure infrastructure. It is equally important, though, for companies to have the right methods and procedures in place to continually monitor for cyber threats and enforce compliance.

If you do not have the expertise or budget to hire full-time staff to perform this function, there are companies that can help. Below are two that have high marks and are performing well in the security marketplace.

Telos (http://www.telos.com)

Telos offers consulting services, from infrastructure penetration testing and asset vulnerability assessment to application software assurance. The company provides consulting services and solutions to many government agencies, including the Department of Defense, the Department of Homeland Security, and other federal agencies, as well as many commercial enterprises. Telos provides extensive cybersecurity services and risk management solutions, including:

  • Security management, engineering, testing and assessment, as well as assessments of compliance with security protocols.
  • Automation and process enforcement for the management of security risks and compliance.
  • Continuous monitoring and management of security risks with necessary corrective action.
  • Mapping and streamlining assets, along with mapping vulnerabilities and controls, to enable continuous security protocol compliance.

The information assurance services of Telos are intended to give clients the most reliable and robust security protocols and protections available today. They are employed by a wide range of civilian and governmental agencies, as well as financial sector organizations. Telos is also accredited by US government agencies working in the cybersecurity industry.

FireHost (http://www.FireHost.com)

For companies looking for secure cloud hosting, FireHost is one of the best options available. FireHost has taken a similar approach, hiring some of the best military and government cyber experts and bringing them to the private sector. It has developed a proprietary cyber-threat perimeter surrounding its cloud infrastructure, with an artificial intelligence module that proactively looks for, and updates in real time against, the latest cyber-threat tactics. By deploying a perimeter system and updating it in real time, FireHost protects its clients and their infrastructure better than most companies can on their own.

FireHost is a leading secure cloud provider, emphasizing security and compliance to reduce risk and protect information. Its services are widely used for healthcare and HIPAA compliance, as well as payments and PCI compliance. FireHost’s unique Payment Island also gives companies a secure, compliant way to handle credit card data and payments, insulating them from the risk of security breaches.

Conclusion

While cybersecurity has become a major concern for any company using cloud computing or storage, there are many ways companies and individuals can protect themselves from malicious attacks. Keeping information safe has become the new holy grail for businesses operating online or using any payment or information service where customer data must be secure. Without taking the correct steps, companies and individuals alike face the daunting prospect of being hacked and losing valuable information. Ensuring that any cloud or other information storage is safe and secure should be a top priority for every business and individual, online or off.