Protecting Data at Its Core

May 20th, 2016 | Posted by Sarah Jones in Guest Blogs | Member Blog Posts

This week on NVTC’s blog, Richard Detore of GreenTec-USA discusses the deep concern over recent cyber-attacks and offers a solution to prevent data damage.


Everyone in the cybersecurity field – both inside and outside of government – is deeply concerned about the kind of cyber-attacks that hit federal agencies such as the Office of Personnel Management (OPM) and private companies such as Sony. Rightly so, and government agencies and private companies continue to make large investments in cybersecurity.

This sense of urgency extends to America’s key infrastructure, as underscored last October when President Obama issued a Presidential Proclamation on Critical Infrastructure and Resilience. In that proclamation, the president noted that

“Our Nation’s critical infrastructure is central to our security and essential to our economy. Technology, energy and information systems play a pivotal role in our lives today, and people continue to rely on the physical structures that surround us. From roadways and tunnels, to power grids and energy systems, to cybersecurity networks and other digital landscapes, it is crucial that we stay prepared to confront any threats to America’s infrastructure.”

Last year, in testimony before the Senate Armed Services Committee, Director of National Intelligence James Clapper noted how cyber-attacks threaten public and private sector interests:

“Most of the public discussion regarding cyber threats has focused on the confidentiality and availability of information; cyber espionage undermines confidentiality, whereas denial-of-service operations and data-deletion attacks undermine availability. In the future, however, we might also see more cyber operations that will change or manipulate electronic information in order to compromise its integrity…instead of deleting it or disrupting access to it. Decision making by senior government officials (civilian and military), corporate executives, investors, or others will be impaired if they cannot trust the information they are receiving.”

And in his most recent appearance before the Senate Armed Services Committee, Clapper stated that “Cyber threats to U.S. national and economic security are increasing in frequency, scale, sophistication and severity of impact.”

According to a recent study published by the cybersecurity firm Tripwire, 82 percent of the oil and gas companies surveyed said they saw an increase in successful cyberattacks over the past year. More than half of the same respondents said the number of cyberattacks increased by 50 to 100 percent over the past month.

Last year, federal investigators uncovered the fact that Russian hackers had penetrated the U.S. State Department in a major cybersecurity breach that gave them access to the White House – including the President’s schedule.

Other threats, such as ransomware, are now on the radar screen of key policy makers in Congress, as well as the U.S. Departments of Justice and Homeland Security. Ransomware encrypts a computer user’s information, and hackers then demand payment – usually in the form of crypto-currency such as Bitcoin (which is extremely difficult to trace) – to unlock the information.

In fact, in recent years several police departments have fallen victim to ransomware and have had to make payments to the hackers. One typical example occurred in Maine, where two police departments were hacked. To date, the perpetrators in these cases have not been apprehended.

Obviously, protecting and securing data at its core is a key component of cybersecurity efforts for both the public and private sectors. Yet while efforts rightly focus on improving detection and enhancing firewalls, protecting data at its core is an approach that is often overlooked.

Until recently, it was not possible to fully protect data at its core – the hard drive. In 2013, Write-Once-Read-Many (WORM) disk technology was developed and successfully deployed, allowing government agencies and private companies, for the first time, to safely secure and protect data at the physical level of the disk. Data stored on a WORM disk cannot be altered, overwritten, reformatted, deleted or otherwise compromised within a computer or data center. The WORM disk functions as a normal hard disk drive, with zero performance degradation from its additional built-in capabilities, which prevent data damage from any form of cyberattack.

This new breakthrough, combined with encryption, makes it impossible for hackers to steal data or render it useless by attacking the stored data or disks.
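To make the write-once idea concrete, here is a minimal, hypothetical Python sketch of WORM semantics. This is purely illustrative: a real WORM drive enforces these rules in disk hardware and firmware, not in application code, and the class below is an invented name, not any vendor’s API.

```python
# Illustrative sketch of Write-Once-Read-Many (WORM) semantics.
# A real WORM disk enforces this at the firmware/physical level;
# this software model only demonstrates the behavior.

class WormStore:
    """An append-only key-value store: each key can be written exactly once."""

    def __init__(self):
        self._data = {}

    def write(self, key, value):
        # Any attempt to overwrite existing data is rejected outright.
        if key in self._data:
            raise PermissionError(f"WORM violation: {key!r} is already written")
        self._data[key] = value

    def read(self, key):
        # Reads are unrestricted, exactly like a normal disk.
        return self._data[key]

store = WormStore()
store.write("audit-log-001", "original payload")
print(store.read("audit-log-001"))          # reads always succeed
try:
    store.write("audit-log-001", "tampered")  # rewrites always fail
except PermissionError as err:
    print(err)
```

Because every rewrite attempt fails loudly, data written once (an audit log, a backup image) cannot be silently altered later, which is the property the article describes at the disk level.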

In addition to advances in malware and firewall enhancements, comprehensive cybersecurity efforts should take a close look at technologies that protect data at its core. Such efforts will impact the public and private sectors in profound ways.

Richard Detore is an NVTC member and CEO of GreenTec-USA, a technology company based in Reston, VA.

LeaseWeb introduces six tips to help you find the perfect cloud partner for your business.


By now, the concept of the cloud is ubiquitous, but for many business leaders the idea still presents more challenges than opportunities. Understanding the complicated technology, not to mention the vast array of delivery models, degrees of services and levels of security available, can be a daunting task for companies under pressure to adapt or adopt.

In a new white paper, “Developing a Cloud Sourcing Strategy: Six Steps to Select the Right Cloud Partner,” LeaseWeb gives decision makers the tools they need to formulate an effective cloud strategy or to identify the right cloud partner to execute it. In summary form, these six tips will help you find the right cloud partner for your business.

  1. Support and services — For most businesses, concerns about cost, security, vendor management and technology take the lead in the search for a reliable cloud partner. Surprisingly, the ability of a provider to smoothly and effectively deliver customer support, SLAs and managed services is often minimized or overlooked, at the expense of the customer. When deciding which cloud partner best fits your needs, don’t underestimate the crucial importance of the support and services they make available. It’s the difference between a cloud partnership that takes your business to new levels and one that just adds to your daily hassles.
  2. Architectural alignment — One of the biggest considerations is whether to use a hyper-scale or traditional hosting model. Practically speaking, a hyper-scale provider requires users to be responsible for operational, day-to-day tasks, while hosting providers oversee the day-to-day management of the infrastructure elements. It’s up to you to decide which is a better fit for your technical team and business needs.
  3. Security and compliance — Data centers are a frequent target of malicious attacks, so it’s important to make sure that your cloud provider is prepared for every eventuality. This means everything from physical security and network threat recognition, to regular security audits to updated compliance certifications like HIPAA. Your data is your most valuable asset, so make sure it’s going to be treated that way.
  4. Support for data sovereignty and residency requirements — In tandem with security and compliance issues, data residency is another issue that frequently stalls cloud and hosting projects. The growth of “bring your own device” (BYOD), big data and cloud projects is dragging sensitive data to third-party clouds and data centers. This makes many business owners uneasy, which is why it’s so important to address the location of your data, the laws governing the export of data wherever it’s stored and the security and encryption of that data.
  5. Financial management — Traditional hosting companies typically offer a more basic cost scheme, based upon initial configurations with monthly utilization. This traditional model works well for companies with steady and predictable usage patterns. Hyper-scale cloud services, on the other hand, were built around granular per minute or hourly costs from their inception. Provisioning is primarily self-service and allows users to turn up server, storage and network services. This feature appeals to users who need to spin up environments in near real time and then turn them down when not needed. Consider your requirements to determine which model fits you – or if you want a mix of both.
  6. Cultural and strategic alignment — Cultural fit with your service provider is a key point that never receives enough attention in the RFP process. For nearly all enterprises, using a cloud or hosting provider is truly a new venture, one that requires extensive internal buy-in. For first-time cloud buyers, the ongoing degree of partnership is an unknown factor. Each provider engages and on-boards clients differently.

If you’re in the process of picking a cloud partner for your business, remember that no one becomes a cloud infrastructure expert overnight. But with a smart approach, you can make an informed decision that will lead to great results for your company.

Ultimately, remember that you will only achieve the higher-performance and lower-cost environments you are aiming for by choosing the provider that fits your needs and requirements best.

To learn more, visit us here to receive our full white paper on selecting the right cloud partner today.

This week on NVTC’s blog, George Lavallee of NeoSystems offers five challenges and solutions to help customers get the most out of CER.


Deltek Costpoint Enterprise Reporting (CER) provides customized, data-driven business intelligence (BI). Fueled by the IBM Cognos 10 analytics engine, CER enables organizations to mine their Costpoint data to provide consistent snapshots of performance, view historical trends and predict results.

CER is the king of BI. Even with its state-of-the-art capabilities, however, there are potential roadblocks (employee turnover, user experience levels, changing data and accounting needs, and other systems within the enterprise) that can keep companies from fully reaping its benefits. Left unresolved, these challenges can cause inaccuracies, consume months of time and erode leader confidence.

1.   Powerful Customization Capabilities

While running CER reports is simple, developing them can be challenging for users unfamiliar with Costpoint’s underlying data structure. To solve this problem, Deltek created standard reports (CER Reports) covering everything from project management to payroll to procurement. These prebuilt templates enable users to quickly generate reports that capture the most commonly used fields across a wide swath of businesses. However, they may not include user-defined fields or specific metrics that you have implemented for your own needs.

To capture these data, enterprises will want to build custom reports or modify existing standard reports. Creating or modifying Cognos reports requires a strong working knowledge of both the Cognos tool as well as the structure of the Costpoint database. For example, you can’t simply click a button to tailor a report to your accounts payable process or labor management structure. You will need to understand where the pertinent data elements reside and how to access them using the Cognos toolset. Many intermediate users lack the skills to effectively craft custom reports.

2. Complexity of Government Contracting Accounting Data

Costpoint uses more than 1,800 inter-related data tables that capture a wealth of information about your company. Access to this complex store of data can be of great benefit to your organization, but creating a report that captures the data relevant to your needs challenges many organizations. Many users are unsure about which data to query and how to convey it on a well-designed report. Common questions include:

  • What data tables do I access?
  • How do I arrange the data?
  • Which charts do I use?
  • What rendering options are best?

Additionally, most organizations have budgets and forecasts and want to integrate this data with actual results within their reports. You may have used another system to create your budgets, such as TM1, Adaptive or even Excel. Accordingly, integrating this data into reports generally means pulling data from systems outside Costpoint, further compounding complexity.
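As a concrete illustration of that integration step, the sketch below joins Costpoint-style actuals with a budget exported from an outside system (such as TM1 or Excel) and computes a variance column for a report. All field names here are invented for illustration; they are not the actual Costpoint or TM1 schema.

```python
# Hypothetical sketch: merging actual costs with an externally maintained
# budget on a shared (project, period) key. Field names are invented.

actuals = [
    {"project_id": "P100", "period": "2016-01", "actual_cost": 12500.0},
    {"project_id": "P200", "period": "2016-01", "actual_cost": 8300.0},
]
# Budget exported from an outside system, keyed by (project, period).
budget = {
    ("P100", "2016-01"): 13000.0,
    ("P200", "2016-01"): 8000.0,
}

def build_variance_report(actuals, budget):
    """Join actuals to budget on (project, period) and compute variance."""
    report = []
    for row in actuals:
        key = (row["project_id"], row["period"])
        budget_cost = budget.get(key)  # None when no budget line exists
        report.append({
            **row,
            "budget_cost": budget_cost,
            "variance": None if budget_cost is None
                        else budget_cost - row["actual_cost"],
        })
    return report

for line in build_variance_report(actuals, budget):
    print(line)
```

The join-key discipline is the point: a report that merges on the wrong keys, or silently drops unmatched budget lines, is exactly the kind of inaccuracy described later in this post.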

3. Robust Security

Increasingly in today’s world, data breaches are affecting companies in all sectors. Breaches tarnish a company’s reputation, expose data and sabotage audit requirements. Fortunately, Cognos and CER deliver robust and highly configurable security controls. The hundreds of available settings, however, can stymie many organizations.

Novice and even intermediate users may not realize that out-of-the-box Cognos installations may not incorporate security settings that are optimal for their company’s situation. Organizations could inadvertently expose proprietary data and confidential employee information. For instance, you might assume that granting access to the projects package would only enable users to see project-related data, but close examination reveals that confidential employee information may be included if it is not properly secured by appropriate user-specific security profiles. The default security settings may not be sufficient to provide the degree of security required in rapidly changing environments.
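The kind of user-specific profile described above can be sketched in miniature: restrict each user to their own projects, and mask sensitive fields unless the profile explicitly allows them. This is a hypothetical illustration of row- and field-level filtering, not actual Cognos/CER security configuration; every name below is invented.

```python
# Illustrative row- and field-level security filter. Profiles, field names
# and data are all hypothetical, not Cognos/CER settings.

PROFILES = {
    "pm_alice": {"see_employee_pii": False, "projects": {"P100"}},
    "hr_admin": {"see_employee_pii": True, "projects": {"P100", "P200"}},
}

SENSITIVE_FIELDS = {"employee_ssn", "salary"}

def filter_rows(rows, user):
    """Drop rows outside the user's projects; mask PII they may not see."""
    profile = PROFILES[user]
    visible = []
    for row in rows:
        if row["project_id"] not in profile["projects"]:
            continue  # row-level security: project not assigned to user
        cleaned = {k: v for k, v in row.items()
                   if profile["see_employee_pii"] or k not in SENSITIVE_FIELDS}
        visible.append(cleaned)  # field-level security applied above
    return visible

rows = [
    {"project_id": "P100", "employee": "J. Doe", "salary": 90000, "hours": 160},
    {"project_id": "P200", "employee": "A. Roe", "salary": 80000, "hours": 150},
]
print(filter_rows(rows, "pm_alice"))  # P200 row dropped, salary masked
```

The failure mode in the paragraph above is simply this filter missing: grant the package without the profile, and the salary column rides along with the project data.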

4. Analytics Development and Optimization

Unless you have a dedicated analytics staff with the required expertise, individuals developing your reports may not be effective. Why? Developing reports and maximizing efficiency requires experience with both Costpoint and Cognos.

Often, employees tasked with reporting business intelligence have other duties, and CER management is a part-time responsibility. They may have deep functional knowledge but minimal understanding of Costpoint data structures and Cognos query and reporting requirements.

If you are caught in this situation, the results can be challenging. Inexperienced users take 10 to 20 times longer to develop a report than an expert. Reports can be late, compressing the time available for meaningful analysis and diverting time away from other business-critical duties that may better align with employees’ hired skill sets.

5. Meaningful Results

Inexperience can also cause inaccuracy in reported results. Novice users may query the wrong data or omit data that would dramatically improve a report’s usefulness. Such mistakes could put contracts at risk or result in poor or ill-informed business decisions. You want your reports to carry the most meaningful data possible.

Imagine you’ve tasked your logistics manager with developing an incurred cost submission report. That individual skillfully maintains your supply chain. But he or she doesn’t use Costpoint every day and may not understand which direct and indirect cost tables to query. Your report might miss critical costs or include unallowable items.

Errors like these erode leader confidence. Just one inaccurate report, and senior managers may mistrust all your data outputs. You’ve damaged your reputation and possibly jeopardized your contracts simply because you didn’t fully understand the data structures and how to best capture the data that conveys the most meaning.

Signs You Need Help

How do you know if these challenges are hindering your data analytics? Talk to your users and business managers. If you hear the following, you are probably underutilizing the power of CER or you might need a CER tune-up.

  1. Reports are consistently late.
  2. More time is spent collecting data than analyzing it.
  3. Executives don’t trust data accuracy.
  4. Significant manipulation is required to analyze data.
  5. Reports have unusually long run times.
  6. Project managers lack administrative visibility (i.e., they can’t see unpaid invoices, approaching funding ceilings, missing bill rates, etc.)

Consequences of Inaction

Inexperienced users waste time collecting the wrong data and not enough time analyzing results. Inaccuracies cause executives to doubt the validity of all your reports. Poor security can expose proprietary data and compromise audit results. Depending upon the number of users, solving these challenges could save your company months of labor and lead to better, more timely business decisions.

If you owned a Tesla, you would make sure you were trained and educated in how to fully utilize it. At a minimum, you would take it to experts to make sure it was operating efficiently and effectively.

What You Need to Know:

To fully benefit from Deltek CER, companies should routinely assess their CER configurations, processes and output. A CER tune-up is easy and should be a standard operating procedure within all Deltek organizations. Maximizing the power of CER will allow companies to reap substantial benefits, overcome any “obstacles” and enable their organizations to succeed.

Preparing for Genomics and Health Data Analytics

February 1st, 2016 | Posted by Sarah Jones in Guest Blogs

This week on NVTC’s Blog, LMI Senior Consultant Daniel DuBravec notes that we need to prepare for personalized medicine and the evaluation of genomic data.


Today’s electronic health record (EHR) systems cannot properly handle genomic data. Interpreting these huge and complex data, particularly in a visual manner, is challenging. Even when EHR systems can access these data, few standards exist for how to structure them to ensure seamless system integration, interoperability, and interpretation. Most medical schools do not teach doctors how to interpret genetic data, and local-level care centers require training on proper data storage and network security.

Precision medicine predicts, prevents, and treats diseases at the patient level. Its growth has created the need for internationally recognized genomic EHR standards and policies, which would protect individuals by ultimately improving patient outcomes. We need to prepare for a future in which medicine is more personalized and better able to evaluate genomic data. 

Real, Inspiring Stories

Recently, I met a colleague whose daughter is suffering from a genetic condition known as Stargardt disease. Sadly, her daughter is rapidly losing her vision. This disease, a form of juvenile macular degeneration, can only appear in children when both parents carry the mutated gene. If the gene had been identified at an early stage, medical practitioners would have had more time to investigate new drug therapies and gene-editing technologies to treat my colleague’s daughter. Physicians at research institutions who were also working on her case could then have viewed this critical information as part of her interoperable medical genetic record and collaborated on her care. Hitting close to home, this is one of many stories that inspire us to prepare for the widespread application of precision medicine and genomic data analysis.

Making Genomic Data Useful for Medical Practitioners

The future of patient care requires connecting large external data sets with electronic healthcare records. Precision medicine will customize treatments down to a patient’s genes and behavior. By analyzing genetic data across thousands of people, scientists will discover preventative treatments and cures for challenging health issues.

Given the complexity of health and genomic data, one can analyze the same data in different ways and achieve different outcomes. “Well-designed data visualization could help doctors interpret the data more rapidly, arriving at more challenging diagnoses in less time,” says Erin Gordon, data visualization trainer and graphic facilitator at LMI.

Before developing a framework for integrating and analyzing disparate health data sets, we test our models for validity. “The quality of our medical data models has a direct impact on patient outcomes and daily operations in medical facilities,” says Brent Auble, a consultant with the Intelligence Programs group at LMI. To support LMI’s research into healthcare data management, our team set up a Hadoop cluster, which is a group of servers designed to quickly analyze massive quantities of structured and unstructured data.
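For readers unfamiliar with how such a cluster processes data, here is a toy, single-machine sketch of the MapReduce pattern that Hadoop parallelizes across servers. The records are fabricated sample data; a real Hadoop job distributes the map and reduce phases across the cluster’s nodes rather than running them in one process.

```python
from collections import Counter
from functools import reduce

# Toy illustration of MapReduce: count genetic-variant mentions across
# records. Sample records are fabricated; Hadoop would shard this work
# across many servers instead of one Python process.

records = [
    "patient:A variant:BRCA1",
    "patient:B variant:BRCA1",
    "patient:C variant:ABCA4",
    "patient:D variant:BRCA1",
]

def map_phase(record):
    # Like a Hadoop mapper: emit a partial count for each record.
    variant = record.split("variant:")[1]
    return Counter({variant: 1})

def reduce_phase(left, right):
    # Like a Hadoop reducer: merge partial counts into a total.
    return left + right

counts = reduce(reduce_phase, map(map_phase, records), Counter())
print(counts)  # total occurrences of each variant across all records
```

The value of the pattern is that the map phase touches each record independently, so massive structured and unstructured data sets can be split across servers and processed in parallel before the reduce phase combines the results.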

Building the Future of Healthcare Analytics

Ultimately, to meet the growth in precision medicine and the use of health data analytics, future EHR systems need to:

  • automatically generate comparisons of multiple genomes,
  • identify and match genetic variants based on known diseases,
  • ensure patient data privacy, and
  • integrate and search medical publications and scientific research for relevant patient data.
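A minimal sketch of the first two requirements above – comparing genomes and matching variants against known diseases – might look like the following. The variant identifiers and the tiny disease table are fabricated for illustration; a production system would draw on curated clinical databases rather than a hard-coded dictionary.

```python
# Hypothetical sketch: compare two genomes' variant sets and flag variants
# linked to known diseases. All identifiers below are fabricated examples.

KNOWN_DISEASE_VARIANTS = {
    "ABCA4:c.5882G>A": "Stargardt disease",
    "CFTR:F508del": "Cystic fibrosis",
}

def compare_genomes(genome_a, genome_b):
    """Return (shared variants, unique to A, unique to B)."""
    shared = genome_a & genome_b
    return shared, genome_a - shared, genome_b - shared

def flag_known(variants):
    """Match variants against the known-disease table."""
    return {v: KNOWN_DISEASE_VARIANTS[v]
            for v in variants if v in KNOWN_DISEASE_VARIANTS}

# Fabricated parental variant sets, echoing the Stargardt story above:
mother = {"ABCA4:c.5882G>A", "MTHFR:C677T"}
father = {"ABCA4:c.5882G>A", "CFTR:F508del"}

shared, only_mother, only_father = compare_genomes(mother, father)
print(flag_known(shared))  # the disease-linked variant both parents carry
```

Even this toy version shows why standards matter: the set comparison only works because both genomes use the same variant notation, which is exactly the interoperability gap the post identifies in today’s EHR systems.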

Preparation is key in order to predict, prevent, and treat disease as medicine evolves.


Dan DuBravec is a senior consultant at LMI, leading IT implementation projects. Mr. DuBravec holds multiple EHR certifications, as well as a BS in product design from Illinois State University and an MS in educational technology leadership from George Washington University.

This week on NVTC’s Blog, Business Development, Marketing & Sales Vice Chair Jenny Couch of member company Providge Consulting shares critical changes to the IT landscape that your healthcare organization needs to have on its radar.


These days, technology seems to advance too rapidly for most of us to keep up. It’s certainly moving too rapidly for organizations to keep up with every single one of the “hot” trends.

In the noisy field of today’s latest tech, it’s all too easy to get caught up in the buzzwords and lists of “This Year’s Hottest IT Trends” and miss the truly critical changes to the IT landscape that your organization needs to have on its radar.

The healthcare industry is uniquely positioned to be impacted by a convergence of critical IT trends in the coming years. But with budgets decreasing and resource pools shrinking, it’s more challenging than ever to prioritize IT needs within the healthcare space.

We’ve highlighted the top five technology trends healthcare organizations must have on their radar in 2016.

  1. Cloud computing. Whether it’s a pharmaceutical company needing to store large amounts of data from clinical trials, or a hospital with a newly implemented EHR system, healthcare organizations of all kinds are increasingly turning to cloud computing for a variety of uses. According to Healthcare Informatics, the global healthcare cloud computing market is expected to reach $9.5 billion by 2020. And 83 percent of healthcare organizations are already leveraging the cloud. Only 6 percent of organizations have no plans to take advantage of the cloud in the coming years. If you’re in that 6 percent, it’s time to reconsider your plans. Cloud computing can be used to decrease costs, improve access, and create a better user experience for any healthcare organization. But, it’s critical that your organization take a strategic approach to moving to the cloud. Learn more about how you can leverage the cloud to best support your organization here. 
  2. The Internet of Things. Take a look at that FitBit on your wrist. Think about the incredible amount of data that one tiny device is constantly generating: the number of steps you take, the calories you burn, your sleep patterns, the stairs you climb. These devices get more accurate and more intricate with every passing day. We are not far off from a future when we’ll be able to monitor nearly every aspect of our health, and the health of our loved ones, without setting foot in a doctor’s office. Healthcare organizations will have to find a way to address what will be a tectonic shift in how care is delivered. Communication methods will need to be established to collect the data generated by wearable and mobile devices. Methods for collecting and analyzing the influx of data will need to be developed so patterns can be identified. And the manner in which treatment is delivered will have to change as we move away from traditional doctor’s office visits and into a world where a diagnosis can be made by analyzing the information generated by a patient’s mobile device, car, appliances, wearables, etc. While this future may not quite be a reality, it’s coming soon, and healthcare organizations need to start preparing today.
  3. Data Explosion. Big data. Data analytics. Whatever term you use, the unparalleled rise in the amount and accessibility of data over the past few years is certain to have a massive impact on the healthcare industry. The explosion in big data occurred so quickly that 41 percent of healthcare executives say their data volume has increased by 50 percent or more from just one year ago. Fifty percent in just one year. This incredible increase in data will allow medical professionals to diagnose patients more quickly and more accurately, but as with the Internet of Things, it will require fundamental shifts in how data is managed and how care is administered. Healthcare organizations will need to train or hire a workforce with the right data analysis and medical skill sets. Regulations, processes and platforms will need to be developed or implemented. Healthcare organizations that ignore this trend do so at their own peril. As Accenture notes in a report released earlier this year, for those who take advantage of the wealth of opportunity within big data, “Greater operational excellence and improved clinical outcomes await those who grasp the upside potential.”
  4. Efficiency in IT. If you haven’t heard the phrase “doing more with less” in the past few months, it’s probably time to climb out from under that rock you’ve been living under. With healthcare spending wildly out of control in the United States, every healthcare organization, from physicians’ offices to the largest hospital chains, is being asked to do more with less. IT is a particularly ripe area for cutting costs and resources. In 2016, the emphasis on doing more with less in IT will continue. Expect to see IT departments pursue options such as moving to the cloud, outsourced managed services, and bring-your-own-device policies to help decrease IT operating costs.
  5. Cybersecurity. In 2014, 42 percent of all serious data breaches occurred at healthcare organizations. Sadly, this trend is certain to continue in the coming years. Healthcare organizations that have not adequately upgraded their systems and developed a thorough cybersecurity strategy are especially vulnerable to attack. Now is the time to evaluate your systems, processes and resourcing. Make sure your organization is positioned to proactively protect against attacks where possible, and to identify and respond rapidly to breaches when they do occur.

Planning your 2016 health IT projects and priorities? Looking for a partner that will truly understand the challenges you are facing and the need to ensure success? Get in touch with us today. Our experienced health IT experts know the obstacles you face and are ready to partner with you to deliver your projects on time and on budget in 2016 and beyond.


Jenny Couch

This post was written by Jenny Couch. Couch is a project management consultant, and Providge’s Business Development Manager. She loves efficiency, to-do lists, and delivering projects on-time and on-budget.

This week on NVTC’s blog, Kathy Stershic of member company Dialog Communications shares her final thoughts in her Brand Reputation in the Era of Data series.


Over the past few weeks, I’ve outlined 8 Principles that will help marketers protect and strengthen their brands in an era of radical change, where there is great temptation (and quite likely management pressure) to push boundaries further than ever before. Throughout this time and many preceding months, I’ve had countless conversations with people about the state of their data as well as the modern conveniences upon which they’ve come to rely. I’ve heard a Big Data expert actively advocating for stretching the law (or hinting at crossing the line) for the sake of competitive advantage. I’m sure he is not alone in that opinion. We are, all of us, currently in the Wild West.

While technology is accelerating what’s possible, the ideas outlined in the 8 Principles come back to common fundamental and timeless human needs that will outlast every wave of technology: People protecting what’s theirs, seeking respect and dignity, wanting control of their lives, enjoying freedom and avoiding harm. The brands they will choose for anything more than a one-time experience will be those who understand those concerns, and actively work to enable them.

There is more to brand reputation than being the app of the moment. Not every new thing will be transformational. But businesses that innovate, truly respect their customers and actively work to earn trust stand a far greater chance of longevity than those who rely on buzz about the shiny new object, or who exploit customers to maximum advantage thinking the ‘sheeple’ won’t notice. It will take work. It will take awareness. It will take intention. It will take courage. And it will take leadership.

Eventually today’s Wild West will give way to a more mature market dynamic. Embracing these 8 Principles may help ensure your company is there when that time comes – or even leading the way.

Brand Reputation in the Era of Data – Principle 1: Empower Customer Control
Brand Reputation in the Era of Data – Principle 2: Be Clear and Accountable
Brand Reputation in the Era of Data – Principle 3: Do Everything You Can to Protect Customer Data
Brand Reputation in the Era of Data – Principle 4: Mind Your Partners!
Brand Reputation in the Era of Data – Principle 5: Practice Customer Empathy
Brand Reputation in the Era of Data – Principle 6: Comply with All Applicable Laws and Regulations. Then Exceed Them.
Brand Reputation in the Era of Data – Principle 7: Apply Technology Thoughtfully
Brand Reputation in the Era of Data – Principle 8: Actively Demonstrate Respect for Your Customers

This week on NVTC’s Blog, Host in Ireland’s President and Founder Garry Connolly discusses Safe Harbor, a policy agreement that regulated the way that U.S. companies export and handle the personal data of European citizens.


In Nov. 2000, the United States Department of Commerce and the European Union established Safe Harbour, a policy agreement that regulated the way U.S. companies export and handle the personal data of European citizens. This enabled American technology companies to compile data generated by their European clients in web searches, social media posts and other online activities.

In 2015, a major issue with the agreement was the collection of personal data that people create when they post something on Facebook or other social media, when they conduct searches on Google, or when they order products or buy movies from Amazon or Apple. Such data is invaluable to companies, which use the information for a broad range of commercial purposes, including tailoring advertisements to individuals and promoting products or services based on users’ online activities.

Safe Harbour does not apply solely to tech companies or online retailers, however. It also affects any organization with international operations, such as a company that has employees in more than one country and needs to transfer payroll information or allow workers to manage their employee health benefits online.

So how did we get from data concerning your preference for wool socks over cotton or your interest in purchasing season four of “Game of Thrones” to controversial issues concerning Europeans’ privacy rights and U.S. national security interests?

As Helen Dixon, Ireland’s Data Protection Commissioner, pointed out in a statement issued by her Office, the issues dealt with in the decision by the Court of Justice of the European Union (ECJ) to invalidate the “Safe Harbour” system, under which companies transfer customer data from Europe to the United States, are “complex.” She elaborated, saying that the issues “will require careful consideration” and “what is immediately clear is that the Court has reiterated the fundamental importance attaching to the right of individuals to the protection of their personal data. That is very much to be welcomed.”

The ruling by the ECJ found that the Safe Harbour agreement is flawed because it allowed American government authorities to gain access to Europeans’ online information. The court said leaks from Edward J. Snowden, the former contractor for the National Security Agency (NSA), made it clear that American intelligence agencies had access to the data, infringing on Europeans’ rights to privacy.

The issue came to a head when Max Schrems, a 27-year-old graduate student living in Austria, argued that Facebook misused Europeans’ online data by cooperating with the NSA’s Prism program. Prism, in part, involved the U.S. federal government’s collection of information on Europeans, gathered from the world’s largest Internet companies, in search of national security threats. An interesting side note: Schrems originally filed his complaint with the Irish Data Protection Commissioner, which is Facebook’s privacy regulator outside the U.S. because the company’s European headquarters are located in Dublin. He eventually took his case to the Irish High Court, which referred it to the ECJ in July of last year. Following the ECJ judgment, the Irish court is expected to rule that the Irish Data Protection Commissioner must investigate his complaint properly and decide whether to suspend such data transfers.

As many large American tech companies have set up their overseas headquarters in Ireland, the Irish government has been supportive of Safe Harbour. However, Helen Dixon has begun discussions with her data protection counterparts in other European Union member countries to determine how best the ECJ’s judgment can be “implemented in practice, quickly and effectively” as it impacts European-to-U.S. data transfers. Host in Ireland is confident that procedures can be established that continue to support the thriving digital economy, respect individuals’ right to privacy, and ensure the safety and protection of our global community, both at home and abroad.


Interested in learning more about data protection? Join NVTC, Host in Ireland, and Helen Dixon for an event on April 7, 2016 at the National Conference Center. Please email marketing@hostinireland.com for details.


This week on NVTC’s blog, Kathy Stershic of member company Dialog Communications continues her Brand Reputation in the Era of Data series by sharing principle eight: actively demonstrating respect for your customers.


The final of these 8 Principles clarifies a concept implied across the other seven. To become and remain a successful brand, businesses must actively demonstrate customer respect. Just saying ‘We respect our customers!’ is not enough. Prove it.

This can take many forms, from being transparent and honest about data collection and sharing practices to moderating your outreach below the annoyance level to integrating this attitude into your culture and policies – and many other opportunities mentioned through these posts.

Respondents often brought up disrespectful practices in their comments. One noted, “I want to feel like a vendor respects my data as much as I do.” People do not like bait-and-switch tactics, confusing changes to privacy policies or anything that feels sneaky. They don’t like bearing the burden of having to stop something, like too much email or too many pop-ups. When everyone is tired and busy with their own lives, wearing people down or hoping they won’t notice might produce a short-term win, but not long-term loyalty.

Having a straightforward dialog with your customers – even the ones who are unhappy with you – is another way to show respect. Everyone messes up – own it! Apologize, make it right and move on. If it wasn’t your fault, but there’s a small cost to making someone feel respected anyway – do it! Nordstrom figured this out a long time ago.

Nothing about customers wanting to feel respected and treated fairly is new. What is new is the exponential increase in vendor relationships enabled through technology. With the tremendous choice the modern customer enjoys, utility, benefit, quality and value are now table stakes. A differentiated and trusted experience, one that includes feeling respected, is what will stand out. Someone’s choice of your product or service is a privilege. One of the best quotes from the respondent feedback sums it up: “Respect the customer and the customer will respect you.”

Brand Reputation in the Era of Data: 8 Principles for Responsible Data Stewardship That Won’t Kill Your Customer Relationships
Brand Reputation in the Era of Data – Principle 1: Empower Customer Control
Brand Reputation in the Era of Data – Principle 2: Be Clear and Accountable
Brand Reputation in the Era of Data – Principle 3: Do Everything You Can to Protect Customer Data
Brand Reputation in the Era of Data – Principle 4: Mind Your Partners!
Brand Reputation in the Era of Data – Principle 5: Practice Customer Empathy
Brand Reputation in the Era of Data – Principle 6: Comply with All Applicable Laws and Regulations. Then Exceed Them.
Brand Reputation in the Era of Data – Principle 7: Apply Technology Thoughtfully


This week on NVTC’s blog, Kathy Stershic of NVTC member company Dialog Communications continues her Brand Reputation in the Era of Data series by sharing principle seven: applying technology thoughtfully while preserving customer data.


Recently, Chapman University published the results of its survey America’s Top Fears 2015. Respondents were asked their fear level about different factors ranging from crime to disasters to their personal futures. FIVE OF THE TOP TEN THINGS PEOPLE FEAR ARE RELATED TO MISUSE OF THEIR DATA! These include cyber-terrorism, corporate tracking of personal information, government tracking of personal information, identity theft and credit card fraud. That’s out of 88 possible things to be afraid of!

There is a tidal wave of automation being applied to data collection and usage practices. I suggest that just because you can do something doesn’t always mean you should. We are approaching a tipping point around the creep factor of having everything one does be tracked. People are tired of constant advertisements, as witnessed by the increased adoption of ad-blocking technology, especially the robust content-blocking capability for Safari in Apple’s recent iOS 9 release, which has been heralded as the potential death of online advertising. As ads are blocked, marketers need to find other ways to get their message through, such as direct contact with mobile devices. That will require permission from each user. And that means you have to deliver a lot of value while also showing some restraint in the level and frequency of contact.
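To give a sense of how that iOS 9 capability works: a Safari content blocker is simply a JSON list of rules that an extension hands to the browser, each rule pairing a trigger (a URL pattern, plus optional conditions like domain or resource type) with an action. A minimal sketch, with an invented domain purely for illustration, might look like this:

```json
[
  {
    "trigger": {
      "url-filter": ".*",
      "if-domain": ["*ads-tracker.example"]
    },
    "action": { "type": "block" }
  },
  {
    "trigger": {
      "url-filter": ".*",
      "resource-type": ["popup"]
    },
    "action": { "type": "block" }
  }
]
```

Because the blocking happens inside the browser engine before requests are made, marketers cannot simply route around it, which is why permission-based channels matter so much more.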

Another interesting wrinkle is the Oct. 6 ruling by the EU Court of Justice that struck down what is called Safe Harbor, a policy that allowed U.S. companies to self-certify that their data protection standards were sufficient for EU citizens, who are protected by strict privacy law. Israel followed suit on Oct. 20. What happens next is yet to be determined, but everyone is scrambling to figure out how to protect their international business before the grace period ends at the end of January.

When practices get abused, people fight back or tune out. It’s human nature. While e-chatting this week during a webinar with its moderator, big data expert Chris Surdak (who, I thought, discussed unbridled capitalism more extremely than anyone I have ever heard), I saw him note regarding privacy that “The backlash will be epic, if we ever get there.” Hmmm. A thoughtful approach to what you collect, how you collect and use it, how long you keep it, with whom you share it and what they do with it will better serve and protect your business and your brand through changes in customer sentiment and the regulatory environment.

Brand Reputation in the Era of Data: 8 Principles for Responsible Data Stewardship That Won’t Kill Your Customer Relationships
Brand Reputation in the Era of Data – Principle 1: Empower Customer Control
Brand Reputation in the Era of Data – Principle 2: Be Clear and Accountable
Brand Reputation in the Era of Data – Principle 3: Do Everything You Can to Protect Customer Data
Brand Reputation in the Era of Data – Principle 4: Mind Your Partners!
Brand Reputation in the Era of Data – Principle 5: Practice Customer Empathy
Brand Reputation in the Era of Data – Principle 6: Comply with All Applicable Laws and Regulations. Then Exceed Them.


This week on NVTC’s blog, Kathy Stershic of NVTC member company Dialog Communications continues her Brand Reputation in the Era of Data series by sharing principle six: comply with all applicable laws and regulations - then exceed them.


There are a LOT of laws and regulations out there that govern data handling and privacy, and they vary according to where you conduct business. The European Union has the strictest set of laws, built on the principle of human rights. The United States takes what’s called a sectoral approach: that is, different laws for different sectors, like HIPAA for healthcare, Gramm-Leach-Bliley for finance, the Cable TV Privacy Act, the Electronic Communications Privacy Act and so on. In the U.S., 47 of 50 states also currently have data breach notification laws, all of them slightly different. Many Asian countries have adopted both data protection laws and sectoral laws. Many Latin American countries have constitutional guarantees, data protection laws and sectoral laws. Yikes! It’s a lot to comply with, and just to keep things fun, laws and regulations are changing all the time.

Realistically, marketers are not going to know every legal requirement that affects their organization. But you should at least be aware of the basic principles of what’s allowed in the places you do business, then coordinate with Legal (I know, I know!) on how to stay out of trouble. This discovery can also happen through a process called a Privacy Impact Assessment, mentioned in my previous post.

Observing laws and regulations must be standard operating procedure. But just being compliant really isn’t enough to enhance your position in a fickle and frenetic market. Think about it this way: do you want your child to just stay out of trouble at school, or to be a leader in the classroom? Where is the attention going to go? You sure don’t want to stand out in a bad way, like being one of the 256 app providers who violated the privacy terms they had agreed to with Apple.

Going beyond the legal minimum and making the extra effort will help your business differentiate itself as a trusted source. Simplified privacy policy language will help. Minimizing data collection and retention (yes, you CAN get rid of stuff!) will help. So will being transparent at all times about your practices and behaviors. Use creative ways to tell the story to your customers and stakeholders – through vignettes, through messaging, through customer service scripts – put it out there. Earning trust marks like TRUSTe really sends the message that you take data stewardship seriously.
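To make the retention point concrete, here is a minimal sketch of a scheduled purge routine. The record shape and the one-year window are hypothetical, chosen purely for illustration; the idea is simply that data past its stated retention period gets deleted rather than hoarded:

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 365  # hypothetical policy: keep customer records one year


def purge_expired(records, now=None):
    """Return only the records still inside the retention window.

    Each record is a dict carrying a 'collected_at' datetime; anything
    older than RETENTION_DAYS is dropped outright rather than archived.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in records if r["collected_at"] >= cutoff]
```

Running something like this on a schedule keeps the data you hold aligned with the data you actually need, which is the spirit of the principle.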

Your customers expect you to comply with the law. They want to feel like you care and are proactive about protecting their data. I firmly believe that the great majority of people want to do the right thing; it comes back to mindfulness and balance between enthusiastic pursuit of business objectives and a bit of thoughtful restraint.

Brand Reputation in the Era of Data: 8 Principles for Responsible Data Stewardship That Won’t Kill Your Customer Relationships
Brand Reputation in the Era of Data – Principle 1: Empower Customer Control
Brand Reputation in the Era of Data – Principle 2: Be Clear and Accountable
Brand Reputation in the Era of Data – Principle 3: Do Everything You Can to Protect Customer Data
Brand Reputation in the Era of Data – Principle 4: Mind Your Partners!
Brand Reputation in the Era of Data – Principle 5: Practice Customer Empathy
