This week on NVTC's blog, Davis Johnson, vice president of public sector at NVTC member company Riverbed Technology, explains why it's important to improve your agency's network visibility.


The best relationships are built on great communication and mutual understanding – which is why the relationship between federal CIOs and the applications that drive their agencies' performance is getting more complicated.

Federal leaders are too often in the dark about which applications are delivering value, which personnel are using them, and how those applications are performing. Agencies simply don't know their apps very well, and understanding applications begins with gaining visibility into the networks they run on.

The network visibility crisis is getting even more serious as agencies move to the cloud and consolidate data centers. As a result, applications are traveling farther across agency networks to reach the defense and civilian workers who rely on them every day. Agencies need to make sure they have visibility into the new network paths, and roadblocks, that their applications navigate, or they face negative impacts to performance and budgets.

In a Riverbed-commissioned survey conducted by Market Connections, over 50 percent of federal IT respondents reported that it takes a day or more to detect and fix application performance issues. Only 17 percent reported being able to address and fix these issues within minutes.

The costs associated with network outages can be staggering. Today, the average cost of an enterprise application failure is $500,000 to $1 million per hour. This is why good network visibility is so important: it allows teams to identify and fix network and application performance problems as they occur.

Many federal IT executives lack the manpower, budget and tools necessary to find and fix performance issues quickly and efficiently. Without the right tools to monitor network and application performance, federal IT professionals cannot pinpoint problems that directly lessen agency or mission effectiveness. This can mean supply chain delays of materiel to warfighters in the field or lack of access to critical defense and global security applications.

Networks need to perform quickly and seamlessly in order to fulfill mission requirements. Performance monitoring tools provide the broadest, most comprehensive view into network activity, helping to ensure fast performance, high security and rapid recovery.
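To make this concrete, here is a minimal sketch in Python of the kind of check a monitoring tool automates at scale: measure how long an application takes to respond and raise an alert when it is slow or unreachable. The endpoint URLs and threshold are hypothetical placeholders, and real monitoring suites track far richer metrics than response time alone.

    import time
    import urllib.request

    # Hypothetical endpoints and threshold; real tools pull these
    # from inventory systems and tune baselines per application.
    APP_ENDPOINTS = [
        "https://apps.example.gov/payroll",
        "https://apps.example.gov/logistics",
    ]
    LATENCY_THRESHOLD_SECONDS = 2.0

    def check_endpoint(url):
        """Measure the response time of one application endpoint."""
        start = time.monotonic()
        try:
            urllib.request.urlopen(url, timeout=10)
        except OSError as exc:
            return None, exc  # outage: no response at all
        return time.monotonic() - start, None

    for url in APP_ENDPOINTS:
        latency, error = check_endpoint(url)
        if error is not None:
            print(f"ALERT: {url} unreachable ({error})")
        elif latency > LATENCY_THRESHOLD_SECONDS:
            print(f"ALERT: {url} slow: {latency:.2f}s")
        else:
            print(f"OK: {url} responded in {latency:.2f}s")

Run on a schedule, even a loop this simple turns "a day or more to detect" into minutes; commercial tools add the automated diagnosis and drill-down that survey respondents asked for.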

With visibility across the entire network and its applications, IT departments can identify and fix problems in minutes, before end users notice and before productivity and citizen services suffer. More than two-thirds (68%) of respondents see improved network reliability as a key value of monitoring tools, and more than three-quarters (77%) said automated investigation and diagnosis is an important feature in a network monitoring solution.

Survey respondents shared which features are important in network monitoring, providing a window into their thoughts about current issues. Those features, listed in order of importance, are capacity planning (79%), automated investigation (77%), application-aware visibility (65%), and predictive modeling (58%).

By improving network visibility, an agency gains improved network reliability, learns about problems before end users do, experiences improved network speed, maximizes employee productivity, and gains insight into risk management and cyber threats. Because IT executives can see the agency's whole network, they can become proactive, not only fixing issues but avoiding them as well.

With today's globally distributed federal workforce, network visibility is critical to monitoring performance, identifying problems, and fixing them quickly.

Using network monitoring tools is a critical step toward managing a complex network environment and ensuring that moves to the cloud are effective and beneficial for the agency, the end users and, ultimately, the constituents.


Top Technology Trends for 2015

March 17th, 2015 | Posted by Sarah Jones in Guest Blogs

This week on NVTC’s blog, Davis Johnson, senior director of Public Sector Sales and Business Development at Riverbed, shares his top tech trends of 2015. 


Technology has always been, and will always be, an ever-evolving landscape. A decade ago the trends and policies we saw in the private sector differed greatly from those taking shape in the government, but heading into 2015 it is clear that those silos have been broken down.

With a national focus on cybersecurity, increased usage of the cloud, and a push towards consolidating IT resources to improve efficiency and save money, we can expect the lines between these groups to continue to blur.

Federal CIOs Will Achieve A Broader View Into Cyber Threats
Unless you have been living under a rock, you have probably heard about the Sony hack. If you haven't, chances are you have heard the president at one point or another talk about cybersecurity and its growing importance to our national security. In fact, at a February 2015 Stanford University appearance, the president signed an executive order encouraging private sector IT to join forces with the federal government and the military in an effort to strengthen overall security across both groups. At the same event, the president highlighted some alarming statistics, one being that cyber threats since he took office in 2009 have impacted more than 100 million individuals and businesses.

Given the importance and emphasis being placed on cybersecurity by both government leaders and businesses, it is safe to say that the cyber conversation will only grow and evolve in the coming years. With that evolution will come increased use of tools that allow agencies and companies to look across their entire network for abnormalities and catch suspicious behavior before it escalates. These visibility tools let network operators and CIOs see who is accessing what information and when, and, if that information is protected or should not be viewed by the user, intervene before any potential leak or hack occurs.
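As a rough illustration, the Python sketch below shows the core idea behind who-accessed-what-and-when visibility: compare observed access against what each user is permitted to touch and alert on mismatches. The users, resources and log entries are hypothetical; real tools derive this data from network flow records and identity systems rather than hand-written tables.

    # Hypothetical permissions table; production tools pull this
    # from identity and access-management systems.
    PERMITTED = {
        "jsmith": {"hr-portal", "email"},
        "mjones": {"email", "finance-db"},
    }

    # Hypothetical access log: (timestamp, user, resource) records
    # that a visibility tool would reconstruct from network traffic.
    access_log = [
        ("2015-03-17 09:02:11", "jsmith", "hr-portal"),
        ("2015-03-17 09:05:43", "jsmith", "finance-db"),
    ]

    for timestamp, user, resource in access_log:
        if resource not in PERMITTED.get(user, set()):
            # The second entry trips this check: jsmith has no
            # right to finance-db, so operators can step in early.
            print(f"ALERT at {timestamp}: {user} accessed {resource} without authorization")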

Analytics will also play a major role in the future of cybersecurity by offering increased visibility and proactively alerting security teams to potentially suspicious activity. Currently, the Intelligence Advanced Research Projects Activity, which conducts research for the U.S. intelligence community, is using public information and Big Data in an effort to actually predict cyberattacks before they occur. This proactive, rather than reactive, approach is something we can expect to see more of as the public and private sectors solidify and sharpen their cyber processes.
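In the same spirit, here is a deliberately simple Python sketch of analytics-driven alerting: compare current activity to a historical baseline and flag statistically unusual spikes. The failed-login counts are made up, and real predictive systems use far more sophisticated models than a three-sigma threshold.

    import statistics

    # Hypothetical hourly counts of failed logins; a real pipeline
    # would stream these from log-collection or SIEM systems.
    history = [12, 9, 15, 11, 14, 10, 13, 12, 11, 16]
    current_hour = 87

    baseline = statistics.mean(history)
    spread = statistics.stdev(history)

    # Flag activity more than three standard deviations above normal,
    # a crude stand-in for the predictive models described above.
    if current_hour > baseline + 3 * spread:
        print(f"ALERT: {current_hour} failed logins this hour "
              f"(baseline {baseline:.1f}, std dev {spread:.1f})")

The point is the posture: instead of waiting for an incident report, the system surfaces the anomaly the moment behavior departs from its baseline.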

The Cloud Will Continue To Mature
Within the government there has been a notable shift from debating whether to move to the cloud to choosing which cloud option best suits an agency's needs. While Gartner's "Private Cloud Matures, Hybrid Cloud is Next" report states that hybrid cloud is today where the private cloud market was three years ago, we can expect to see agencies weighing all of their cloud options in 2015 and beyond.

In fact, one cloud option that has long been popular in the private sector and is now gaining traction in the government is the public cloud. With the Defense Information Systems Agency's newly released guidelines, the Department of Defense (DoD) now has a clear outline for what it can place in the public cloud and what must be housed within a virtual environment, among other things. With these guidelines, we can expect a deeper conversation and greater openness to public cloud offerings within the government, with information from both sectors housed in the same place.

Data Center Consolidation
With increased virtualization throughout the government, data center consolidation will continue to be a hot topic in 2015 and beyond. By consolidating data centers, agencies can reduce costs, improve their security and streamline overall IT processes. In fact, a 2014 U.S. Government Accountability Office report found that of the 24 agencies participating in the Federal Data Center Consolidation Initiative, 19 collectively reported achieving an estimated $1.1 billion in cost savings and avoidances between fiscal years 2011 and 2013.

While data center consolidation brings obvious benefits, the shift also means that applications are now hosted farther away from the employees and federal workers who rely upon them every day. That distance, and the resulting complexity, require networks to keep pace. Federal CIOs and companies will therefore look for tools to assist in consolidating their data centers over the next few years: tools that provide visibility into application and network performance issues, and tools that help resolve bottlenecks so workers have access to the apps they need and productivity doesn't suffer. Data center consolidation is here to stay, so we can expect IT leaders on both sides to implement optimization tools to ensure consolidated data centers deliver maximum benefit.


Notes from the Silicon Valley Cybersecurity Summit: Part 2

September 30th, 2014 | Posted by Sarah Jones in Guest Blogs

NVTC is inviting members to serve as guest bloggers, sharing insights and information on trends or business issues relevant to other members. Kathy Stershic of member company Dialog Research & Communications shares her insights below.


While the policy panel discussion at this summer's Silicon Valley Cybersecurity Summit pointed out the many challenges governments face in dealing with the cyber threat, the second 'Next Generation' panel was all about the shortage of qualified talent to deal with the problem.

The good news – cyber presents a great career opportunity! As in, the industry needs lots of help. Now. The not-so-good news is that an estimated 40 percent of IT security jobs will go unfilled in 2015. There simply aren't enough qualified people to fill them. Technologies such as new threat intelligence and attack remediation products will continue to advance. That will help automate intervention, but there is still a need for people to skillfully apply them, and for others to create them in the first place, in the face of a never-ending stream of new threats. One speaker said that, as of only a couple of years ago, a new piece of malware was detected every 15 seconds. Now two new malware samples are detected every second! The speakers expected that pace to accelerate exponentially.

There are a growing number of formal university programs in this area, but I was very surprised to hear that only 12 percent of computer science majors are female, and that population has been steadily shrinking for two decades. Only a marginal percentage of those study cyber. So we've got a challenge with public engagement in the issue, an inadequate talent pool, and almost half of the student population not thinking about the problem.

Of course, not all software learning happens in the classroom, and talented hackers do emerge. That is why General Keith Alexander [former head of U.S. Cyber Command] went to last year's Black Hat Conference – while unconventional, he knew it is a place to find badly needed talent. There are also several incubator initiatives, like Virginia's Mach37, and many startups trying to get off the ground.

Another challenge is that CEOs don't fundamentally understand the complex cyber problem, so they delegate the task to the CIO. [This reminds me of similar dispositions toward Disaster Readiness and Business Continuity Planning pre-9/11.] Cyber threat is another form of business risk and should be planned for as such. One speaker mentioned that there is expert consensus, even from VCs who are scrupulous about how money is spent, that for a $100 million IT budget, 5-15 percent – that is, $5 million to $15 million – should be spent on security. While panelists noted cyber threat is a top discussion point for many corporate boards, there is uncertainty about what to actually do to prepare.

This is a tough issue all the way around. One speaker suggested repositioning the brand message to what regular folk will respond to – protecting our national treasures, homes and quality of life, critical infrastructure and national security. Nick Shevelyov, Chief Security Officer of Silicon Valley Bank, summarized the issue: "the technology that empowers us also imperils us." I'm hoping more of us come to understand that and step up.


Contributed by Kathy Stershic, Principal Consultant, Dialog Research & Communications

kstershic@dialogrc.com
