Gartner predicts there will be an estimated 8.4 billion IoT devices by 2020. Tenable President, Chief Operating Officer and Co-Founder Jack Huffard discusses how the proliferation of digital assets and connected devices is creating an exposure gap in cyber defense, and shares how organizations can fight back against cyber-attacks. Huffard participated in the Successful Cybersecurity Growth Companies in the Region panel at the Capital Cybersecurity Summit on Nov. 15, 2017.


It’s been more than two years since the Office of Personnel Management (OPM) disclosed one of the largest data breaches in history, but just last week the agency’s inspector general gave it a failing grade in critical areas like risk management and contingency planning.

In addition, the data breaches and attacks we’ve recently seen across a variety of industries, including entertainment, critical infrastructure, retail and finance, make it clear that all organizations are still failing when it comes to basic cyber hygiene.

Today, a company’s assets range not just from laptops to servers but also include mobile devices, internet-connected appliances and the cloud. The latest research shows the number of these assets is only going to increase. For example, Gartner predicts there will be an estimated 8.4 billion IoT devices by 2020. And according to a 2016 IDG Enterprise Cloud Computing Survey, 70 percent of organizations already have apps in the cloud and 16 percent more will within the next 12 months. This modern, elastic attack surface, where the assets themselves and their associated vulnerabilities are constantly expanding, contracting and evolving, has created a massive gap in organizations’ ability to truly understand their cyber exposure at any given time.

Another major component of today’s elastic attack surface is operational technology (OT), particularly given the growing risk of cyber-attacks against critical infrastructure sectors. A recent Ponemon Institute study on the state of cybersecurity in the U.S. oil and gas industry found, for example, that OT targets now comprise 30 percent of all cyberattacks. As with cloud and IoT assets, the cyber exposure gap is exacerbated by the mismatch between the cyber measures deployed by critical infrastructure companies and the rapid pace of digitization in their operations. Operational technologies present an additional challenge: they often can’t be assessed with the same approaches as IT assets, creating blind spots for security operations and compliance teams.

We recently announced a partnership with global engineering and technology leader Siemens that aims to address those unique risks. The product, Industrial Security from Tenable, was designed specifically for industrial control systems and will be delivered through Siemens to give energy and utilities companies full visibility into production networks to reduce compliance risk and their cyber exposure.

Both public and private organizations in every sector need to change their approach to cyber risk to effectively manage their cyber exposure. That starts with understanding and protecting what matters most across their entire attack surface. And it means looking at server and endpoint hardening, IoT discovery and hardening, container and web app vulnerability identification, and OT asset and vulnerability detection.
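To make that visibility requirement concrete, here is a minimal sketch in Python (not a depiction of any Tenable product; the asset classes, fields and sample entries are illustrative assumptions) of a unified inventory that spans IT, cloud, IoT and OT assets and flags what has never been assessed:

# A minimal sketch of a unified asset inventory spanning IT, cloud, IoT and OT,
# flagging asset classes with open vulnerabilities or no assessment coverage.
from collections import defaultdict
from dataclasses import dataclass, field
from typing import List

@dataclass
class Asset:
    name: str
    asset_class: str              # e.g. "server", "endpoint", "iot", "container", "web_app", "ot"
    vulnerabilities: List[str] = field(default_factory=list)
    last_assessed: str = ""       # ISO date of last scan; empty if never assessed

def exposure_summary(assets: List[Asset]) -> dict:
    """Group assets by class and count open vulnerabilities and unassessed assets."""
    summary = defaultdict(lambda: {"assets": 0, "vulns": 0, "never_assessed": 0})
    for a in assets:
        s = summary[a.asset_class]
        s["assets"] += 1
        s["vulns"] += len(a.vulnerabilities)
        s["never_assessed"] += 0 if a.last_assessed else 1
    return dict(summary)

if __name__ == "__main__":
    inventory = [
        Asset("web01", "server", ["CVE-2017-5638"], "2017-11-01"),
        Asset("hvac-controller", "ot", [], ""),             # blind spot: never assessed
        Asset("branch-camera", "iot", ["default-password"], "2017-10-20"),
    ]
    for asset_class, stats in exposure_summary(inventory).items():
        print(asset_class, stats)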

Understanding risk and cyber exposure is also an awareness issue that should start at the top. If the C-suite and board of directors know which areas of their business are secure or exposed, that knowledge can drive strategic business decisions, including where and how much to invest to reduce risk. Attackers will always find the weak link, and right now there are too many weak links – even more than companies are aware of.

This year alone, there were several high-profile, large-scale cyber-attacks, including the NotPetya destructionware, CrashOverride/Industroyer threats to critical infrastructure, and the Reaper IoT botnet. These attacks caused millions of dollars in company damage and compromised sensitive customer data; no organization wants to experience one of these security headlines firsthand. Only with a holistic approach that starts with basic cyber hygiene – visibility to identify all assets and their vulnerabilities – can companies secure today’s complex attack surface.


NVTC’s newest guest blog post from Exostar explains why new government regulations are giving organizations a fresh concern when it comes to cybersecurity. Exostar’s Senior Vice President of Product Development Vijay Takanti will be part of the panel discussion, NIST 800-171: Is the Government Paving the Way for Commercial Security? at the 2017 Capital Cybersecurity Summit November 14-15.


Cybercrime is on the rise and could cost businesses over $2 trillion by 2019. These losses could be the result of outright theft, lost productivity, eroded customer confidence or the costs of repairing breaches. But a new, equally ominous cybersecurity risk is emerging for both government contractors and downstream commercial businesses: the risk of losing current and future contracts due to non-compliance with new government standards.

Department of Defense contracts now include a clause, DFARS 252.204-7012, “Safeguarding Covered Defense Information and Cyber Incident Reporting.” The new clause requires contractors (and their extended supply chains) to implement NIST SP 800-171 cyber safeguards by December 31, 2017 – or at least have a coherent plan for doing so.

NIST SP 800-171 is a set of 110 security controls regulating the handling of sensitive (but not classified) data. Most organizations in the aerospace and defense industry are well aware of these standards and their application to the DFARS mandate by now. However, other organizations that don’t work directly with the government may get pulled into NIST 800-171 compliance because of the global, multi-tiered nature of prime contractors’ supply chains.
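For suppliers just getting started, even a lightweight tracker can clarify where they stand against those 110 requirements. The sketch below is a minimal illustration only (the control IDs and descriptions shown are placeholders, not an official checklist or assessment tool) of recording implementation status and surfacing the gaps that still need a plan of action:

# A minimal sketch of tracking NIST SP 800-171 requirement status and
# producing a plan-of-action gap list. IDs and descriptions are illustrative.
from dataclasses import dataclass
from typing import List

@dataclass
class Requirement:
    control_id: str        # e.g. "3.1.1"
    description: str
    implemented: bool
    planned_date: str = "" # target date if not yet implemented

def gap_report(requirements: List[Requirement]) -> List[Requirement]:
    """Return requirements that still need a plan of action."""
    return [r for r in requirements if not r.implemented]

if __name__ == "__main__":
    reqs = [
        Requirement("3.1.1", "Limit system access to authorized users", True),
        Requirement("3.5.3", "Use multifactor authentication for privileged accounts", False, "2017-12-15"),
    ]
    for r in gap_report(reqs):
        print(f"Open: {r.control_id} - {r.description} (target {r.planned_date})")
    pct = 100 * sum(r.implemented for r in reqs) / len(reqs)
    print(f"Implemented: {pct:.0f}% of {len(reqs)} tracked requirements")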

Keep in mind that the supply chain on any given project can include hundreds or even thousands of suppliers who are privy to covered defense information (CDI). As the volume of suppliers and the information they exchange rises, so does their vulnerability to cyber-attack and CDI compromise. Even small pieces of information need to be protected at all times.

The NIST 800-171 rules are designed to best protect this sensitive information as it moves across every level of the supply chain. If even one link in the chain is insecure, it could spell trouble for all parties participating on a government program. Officially, the government can start including NIST 800-171 compliance as a requirement for contracts once the rules are in effect. If organizations are not compliant, they will not be able to bid on those contracts, and existing contracts could be in jeopardy.

Organizations that are not compliant with these new cybersecurity controls run the risk of losing out on business, as primes and larger suppliers select preferred vendors who can demonstrate proper cybersecurity hygiene.

The deadline is looming. Mitigate the latest cybersecurity risk by understanding and implementing the NIST 800-171 security controls now, or find a qualified partner to help you do so.


The Myth of Cloud Insecurity


Telos Corporation CEO and Chairman of the Board John Wood addresses cloud security in his new guest blog. Wood will be moderating the State of Cloud Security and Compliance panel at the Capital Cybersecurity Summit on Nov. 14-15 at The Ritz-Carlton, Tysons Corner.


It’s not exactly clear when the term “cloud” was first used to describe shared pools of configurable IT resources. However, it’s safe to say that it started creeping into our lexicon less than ten years ago.

Back then, the official definition of cloud was even less clear than it is today. Regardless of what the cloud actually was, this mysterious cloud entity was widely assumed to be unsafe.

That said, even from the beginning, I saw that the cloud offered many security advantages, especially to smaller companies that couldn’t afford to make infrastructure investments and hire many highly-skilled staff to manage complex IT systems in their own on-premises data centers. Still, doubts about cloud security swirled.

But in 2014, a crazy thing happened. Defying conventional wisdom, the CIA, arguably the most security-conscious organization in the world, announced its plan to work with Amazon Web Services (AWS) to adopt commercial cloud services. Shortly thereafter, C2S, the agency’s Commercial Cloud Services environment, was born.

Even though countless other agencies had already adopted the cloud by 2014, the CIA and C2S gave the cloud instant credibility. The move made federal agencies and highly regulated commercial organizations realize that if cloud technology is good enough and secure enough for the CIA, then it must be secure enough for them. Granted, C2S is an isolated environment, but it was still noteworthy that the CIA made the often-trumpeted “cloud first” policy a reality.

AWS recognized early on that security was essential to continued, widespread adoption of cloud services. To that end, it introduced a shared responsibility model to help explain the security benefits customers derive simply by hosting their workloads within AWS. Under this model, the customer is responsible for security in the cloud (their data, applications and configurations), and AWS is responsible for security of the cloud (the underlying infrastructure).

Not only does this shared responsibility model help address a number of security questions, especially in the areas of infrastructure and physical security, it also helps clients demonstrate compliance more quickly and efficiently, because they can inherit compliance results directly from AWS.

AWS certainly isn’t the only cloud service provider (CSP) in the game – Azure and Google also understand how important the message of cloud security and compliance is to driving further cloud adoption.

Despite all of this, it is essential for organizations to understand the potential security pitfalls of cloud adoption. They must know where the cloud service provider’s responsibility stops and the customer’s responsibility starts. There have been a number of recent breaches resulting from unsecured cloud-based database deployments. Customers need to understand, and take seriously, their responsibility for protecting their systems, their applications and their data.
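As one concrete example of that customer-side responsibility, the short sketch below (assuming AWS credentials are already configured for boto3) flags S3 buckets whose ACL grants access to public groups. It is a starting point, not a complete audit: bucket policies and object-level permissions would need separate review.

# A minimal sketch: flag S3 buckets whose ACL grants access to public groups.
import boto3

PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def publicly_readable_buckets():
    s3 = boto3.client("s3")
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        acl = s3.get_bucket_acl(Bucket=bucket["Name"])
        for grant in acl["Grants"]:
            if grant["Grantee"].get("URI") in PUBLIC_GROUPS:
                flagged.append((bucket["Name"], grant["Permission"]))
    return flagged

if __name__ == "__main__":
    for name, permission in publicly_readable_buckets():
        print(f"Review bucket '{name}': grants {permission} to a public group")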

The cloud has come a long way over the last ten years. Much progress has been made to enhance security and promote these security and compliance benefits. However, there is still work to be done to address lingering security concerns, questions and perceptions to help drive even broader adoption of cloud services.

If you’d like to hear what CSPs have to say about the myth of cloud insecurity, join me on Wednesday, November 15 at NVTC’s Capital Cybersecurity Summit. I will be moderating a panel that will discuss the current state of cloud security and compliance, featuring prominent voices from the big three cloud providers: Google, Microsoft and AWS. I hope to see you there!


The world’s total digital data volume is doubling in size every two years, prompting organizations to find new ways to secure their complex data. In their new NVTC member blog, LMI provides tactics for determining cybersecurity threats in your organization’s digital supply chain and securing critical data.


The world’s total digital data volume is doubling in size every two years, and by 2020 will contain nearly as many digital bits as there are stars in the universe. Most of this data is created and communicated over the Internet, whose “population grew by more than 750 percent in the past 15 years to over 3 billion. This population shares more than 2.5 million pieces of content on Facebook, tweets more than 300,000 times, and sends more than 204 million text messages—every minute.”

With the advent of the Internet of Things and other innovative technology platforms, organizations must continuously analyze and secure their complex data. For supply chain operations, digitalization has enabled leaders to access data faster and build stronger connections within a given supply chain. While there are clear benefits of the digital supply chain, there are challenges that need to be overcome in order to realize its full potential.

According to Ernst & Young, complex data presents numerous challenges to supply chains:

  • The volume of data is skyrocketing as diverse data sources, processes and systems show unprecedented growth. Companies are trying to capture and store everything, without first establishing the data’s business utility.
  • The fact is, technology is enabling this proliferating data complexity. Continuing to ignore the need for an enterprise data strategy and information management approach will not only increase “time to insight,” but may actually lead to incorrect insights.

Perhaps none of these challenges is as critical as an organization’s ability to successfully secure its supply chain data, given the IT security risks posed by the Internet. In fact, 30 percent of supply chain professionals are “very concerned” about a data breach. The concerns of these professionals are well-founded. The number of cybersecurity breaches is growing by 64 percent every year, with 60 percent of cyber breaches linked to insiders: current and former employees, contractors, service providers, suppliers and business partners.

Unfortunately, many organizations are unaware of the security vulnerabilities within their supply chain or how to identify them. To help determine those vulnerabilities, start by answering the following three questions:
1. How will the product be used and managed in the system? While any system breach is bad, the compromise of a system managing classified data is much worse than that of a system managing publicly available data. Understanding the use of the Information and Communication Technology (ICT) equipment will help determine the resources appropriate to secure the system. In reviewing the product’s use, consider what other systems are connected to the focus system. A less secure system can serve as a pathway to attack a more highly secured connected system. This was the method used to steal credit card numbers from Target in 2013.

2. How is the system connected to the rest of the world? A system that is connected to the public Internet will need stronger security, since it is easy to find and attack. On the other hand, a system that is isolated from any other network has a much lower risk of attack or data breach, since the attacker would need to be in physical proximity to the system.

3. Who are the system users? Are the users internal employees who are trained on security procedures, or is the system accessed by a public user base that may not be mindful of risky security behaviors? Simple security procedures, such as keeping passwords secret and maintaining current anti-virus software, cannot be counted on if you do not directly control users’ environments.

By answering these questions, organizations can quickly and effectively determine the security vulnerabilities within their digital supply chain. Organizations can also contact our cybersecurity experts, who can help monitor, prioritize and effectively manage risks to create an optimal level of security based on mission priorities and resource constraints.
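As a simple illustration, the three questions can even be turned into a rough screening score. The categories and weights below are illustrative assumptions for this post, not LMI’s methodology:

# A minimal sketch of turning the three questions above into a rough risk score.
def system_risk_score(data_sensitivity: str, internet_facing: bool, public_users: bool) -> int:
    """Higher score = more attention warranted. data_sensitivity: 'public', 'sensitive' or 'classified'."""
    score = {"public": 1, "sensitive": 3, "classified": 5}[data_sensitivity]  # question 1
    if internet_facing:
        score += 3   # question 2: easy to find and attack
    if public_users:
        score += 2   # question 3: cannot rely on trained users' hygiene
    return score

if __name__ == "__main__":
    print(system_risk_score("sensitive", internet_facing=True, public_users=True))    # 8
    print(system_risk_score("classified", internet_facing=False, public_users=False)) # 5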

 


What do cloud and AI mean for human resources? Will automation replace human resource functions and associates? Read on to find out in Insperity’s newest NVTC guest blog.


Cloud-based tech solutions for human resources offer the promise of easy installation and implementation, but does such software really eliminate the need for HR staff?

The short answer is no.

While new technical offerings can improve the efficiency and speed of many HR processes, the human touch is still needed to get the most out of the software.

For example, you’ll still need someone to “operate the machinery,” so to speak, or administer the software. In a smaller company, that may be one combination payroll and HR person. In a larger company, you may need one employee to do nothing but maintain, update and run the software so that your company gets the most from its capabilities.

When HR software works best

Technology is your friend when it comes to the tactical aspects of human resources. For instance, an online time tracking system that ties to your payroll and government reporting systems can save significant time and improve accuracy over manual tracking and handwritten reports.

Cloud-based HR software can automate formerly complex, time-consuming activities including:

  • New hire paperwork, such as I-9 verification of identity and employment authorization in the United States
  • Storing of data for compliance
  • Tracking of critical HR data related to hours worked by project or department, turnover and more
  • Garnishments, reporting and mandatory requirements that vary by state

For example, a company operating in a big state like Texas may not be accustomed to the HR complexities of hiring across state lines. But open an office in New York, and you could have employees who work in that state but live in Connecticut or New Jersey.

HR software can help ensure your compliance with multiple states’ payroll tax requirements, and prevent you from having to learn and implement such widely disparate laws on the fly. The best-case scenario is when you have the right software in place to facilitate efficiency and compliance, with access to experienced HR professionals to guide you.
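To make the multi-state example concrete, here is a deliberately simplified sketch of the kind of lookup such software automates. The rules and reciprocity pairs below are hypothetical placeholders, not actual tax guidance; real payroll tax rules are far more involved.

# A minimal sketch, with hypothetical rules, of deciding which state's income tax
# to withhold for a given work/home state pair.
RECIPROCITY = {("NJ", "PA"), ("PA", "NJ")}   # hypothetical reciprocity pairs

def withholding_state(work_state: str, home_state: str) -> str:
    """Return the state whose income tax should generally be withheld (under these assumed rules)."""
    if work_state == home_state:
        return work_state
    if (work_state, home_state) in RECIPROCITY:
        return home_state   # assumed reciprocity: withhold for the employee's home state
    return work_state       # default assumption: withhold where the work is performed

if __name__ == "__main__":
    print(withholding_state("NY", "CT"))   # NY, under these assumed rules
    print(withholding_state("PA", "NJ"))   # NJ, assumed reciprocity pair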

What to look for in HR software

Once you’ve determined that an HR software package delivers the basic functions your business needs and will help drive company goals, it’s time to take a deeper dive into its functionality.

Some questions to consider:

  • What purpose will this software serve? Will it eliminate, add to or integrate with your existing systems?
  • Who will administer the software? Will they require extra training? If yes, how much? How much training is included in the price?
  • Is this software backed by HR on demand? For example, even with the best software, you’ll still have the occasional compliance question. Look for a software solution with human support.
  • Will this software integrate with other existing software for payroll, time and attendance, or enterprise resource planning (ERP)?

As you talk to software vendors, it’s vital you involve frontline workers who operate existing systems to help you evaluate any new HR software and its integration requirements. Depending on your current set-up, this may mean you bring in the payroll administrator, ERP data manager, compliance officer or the HR specialist managing the current performance system.

These are the people who can help you avoid the costly mistake of buying software that ultimately will not “play nice” with your other systems, since they know the intricate details of how your existing systems really work.

Why leadership is still needed

While cloud-based software may streamline many HR processes, there’s no substitute for sound leadership. Think “strategic” versus “tactical.”

Yes, software can help a company align its objectives and drive engagement through performance management, employee feedback mechanisms, people analytics, training, and compensation and rewards systems. But no software will ever replace a leader who communicates, inspires and motivates employees to achieve the organization’s goals.

As a business grows, it becomes harder to keep employees aligned with the company’s goals and strategies. Software can help keep your ship on the right course, but at the end of the day, any technology solution is only as good as the people behind it.

Learn more about Insperity here.


NVTC’s newest blog is by Dovel Technologies Vice President Mike Atassi. Atassi recently moderated the healthcare data analytics panel at the inaugural Capital Health Tech Summit on June 15, 2017. Scroll below for full video of the panel.


Data is being generated at unprecedented levels – with more than 2.5 quintillion bytes being created every day. Unlocking the potential value of this data will help accelerate research, develop targeted therapeutics and improve the delivery of healthcare. Today’s information and computational sciences and technologies are playing a critical role in delivering better healthcare to everyone.

Accelerating the path to discovery and finding targeted therapeutics to address some of the most challenging chronic diseases is a promise that can be largely fulfilled by exploiting available data. While primary investigation has long been the most important source of new data and discoveries, today data scientists are curating existing data to make it searchable, accessible, interoperable and reusable.

A panel of experts discussed the role of data analytics in the continuum of health at the recent NVTC Capital Health Tech Summit and provided valuable lessons on how to protect, govern, and transform data into valuable information and health insights. The panelists discussed different ways to enable health data to be searchable, accessible, interoperable and reusable. Key themes from the discussion included:

Building a data-rich infrastructure: Incorporating genomic and proteomic data into clinical delivery is a challenge that is being met with innovation in technology and information architecture, transforming large, disparate data sets into consumable, actionable packages.

Utilizing advancing technologies: Deploying machine learning and predictive analytics alongside the data, processes and workflows that already exist within hospitals can help predict and prescribe new protocols. For example, the use of predictive analytics and machine learning resulted in a 39 percent reduction in patient falls in just six months at a local hospital (a simplified sketch of this kind of model appears after this list).

Improving wellness: Enabling the delivery of integrated wellness, disease management, and healthcare services to the community based on insights from data. For example, data analytics is playing a key role in improving the effectiveness and global efficiency of transfusion medicine and cellular therapeutics.

Reducing risks: Helping to prevent the spread of major diseases, such as the Zika virus, by integrating datasets from multiple sources to identify geographic risk patterns (see the brief data-integration sketch after this list). Data also allows for the benchmarking of activities to guide decisions that will ensure the right person gets the right treatment at the right time.

Preparing a new generation of data scientists: Bringing together interdisciplinary individuals with domain and technology expertise to develop leading public health and precision medicine professionals. Today, many institutions of higher education are offering advanced degrees in data sciences – combining the knowledge of biological sciences with computational and mathematical sciences to produce a generation of data scientists capable of unlocking the value hidden in large and complex data systems. Data scientists today are already showing tremendous progress in biomedical computing in terms of developing meaningful solutions for analytics, visualization, and data management and governance.
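To make the “Utilizing advancing technologies” theme more concrete, here is a deliberately simplified sketch of the kind of model that can flag patients at elevated fall risk. It runs on synthetic data, and the features, weights and threshold are illustrative assumptions, not the hospital program described above.

# A minimal sketch of a fall-risk classifier trained on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Hypothetical features: age, mobility score (0-10, lower = less mobile), count of sedating meds
X = np.column_stack([
    rng.integers(40, 95, 500),
    rng.integers(0, 11, 500),
    rng.integers(0, 5, 500),
])
# Synthetic labels: falls more likely with higher age, lower mobility, more sedating meds
risk = 0.03 * X[:, 0] - 0.3 * X[:, 1] + 0.5 * X[:, 2]
y = (risk + rng.normal(0, 1, 500) > 1.5).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
new_patients = np.array([[82, 3, 2], [55, 9, 0]])
for patient, p in zip(new_patients, model.predict_proba(new_patients)[:, 1]):
    flag = "review fall precautions" if p > 0.5 else "standard protocol"
    print(patient, f"fall risk {p:.2f} -> {flag}")

Similarly, for the “Reducing risks” theme, the brief sketch below (with made-up regions and figures) shows how integrating two datasets by region can surface geographic risk patterns:

# A minimal sketch of joining case counts with travel data to compare regions.
import pandas as pd

cases = pd.DataFrame({"region": ["A", "B", "C"], "reported_cases": [12, 45, 3]})
travel = pd.DataFrame({"region": ["A", "B", "C"], "arrivals_from_affected_areas": [1500, 9000, 200]})

merged = cases.merge(travel, on="region")
merged["cases_per_1k_arrivals"] = 1000 * merged["reported_cases"] / merged["arrivals_from_affected_areas"]
print(merged.sort_values("cases_per_1k_arrivals", ascending=False))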

With these advances, real challenges remain with limitations raised by ethical, legal, procedural and even technological constraints. In order to successfully meet these challenges, the industry must use a sound foundation of proven techniques and processes to ensure predictable results. However, the continued convergence and collaboration of biomedical sciences and technologies – along with increased demand for precision healthcare – will provide the impetus to meet these challenges and deliver real breakthroughs for better health.

View full video from the panel:

 

The panel discussion was moderated by Mike Atassi, Vice President, Dovel Technologies, and included Aaron Black, Chief Data Officer, Inova Translational Medicine Institute; Dr. Abigail Flower, Lecturer, Department of Systems and Information Engineering, University of Virginia; Chris Ghion, Vice President and Chief Information Officer, Adventist HealthCare; and Dr. Barbee I. Whitaker, Senior Director, Department of Research, AABB.


In Asurion’s new member blog post, Senior Vice President of Retail Application Delivery and Voice Services Sean Nass discusses embracing new product development models rooted in collaboration and focused on outcomes. A global leader in connected life services, Asurion partners with leading wireless carriers, retailers and pay-TV providers to provide consumers with protection and premium tech help supporting mobile phones, consumer electronics and home appliances.


There is a big shift in how technology companies are going about business.

Many are pivoting from a process- and project-based strategy to one that is more forward-thinking. Team members are crossing boundaries, blurring the lines of previous structures and coming together in the spirit of collaboration to deliver outcomes that exceed customer expectations.

In the old model of business, the technology focus was on delivering large projects using project managers and a pool of resources, defining and limiting capacity. Instead of focusing on an outcome, teams would get together and create big requirement documents with minute details that would bog down capacity, forcing a project through months of work and still frequently achieving a result that was somehow different from what was initially envisioned. This old model of product development often carried heavy opportunity costs.

Now businesses are pivoting, with a more forward-thinking attitude in mind. At Asurion, we recently built what we call journey teams: groups of individuals from across key sectors of our business who come together to optimize the speed with which a project achieves its desired outcome and experience for consumers. As part of this shift, we merged product, design, technology and customer experience teams to optimize the process with the project’s outcome in mind rather than focus on the process itself. The days of separate “product” and “IT” silos are behind us. We’ve combined product, design and technology teams and have empowered them to ask the question “How do we focus on what’s best for the consumer experience?”

Take the claim process as an example. Under previous models, a customer’s claim would pass through various workflows, often with redundant or unnecessary steps that may not have been a great experience for the customer. Under our journey team model, we dedicated product, technology and design leads to focus on an outcome that equates to a positive customer experience. This mentality leads to faster time to market and less waste in resource capacity, and gives our team members the ability to innovate rapidly. More importantly, the customer has a really positive experience.

We don’t tell our journey teams what to do or how to do it – instead, they innovate and test ideas and are empowered to make decisions on their own, all with this singular goal of improving consumer experience. The journey teams put together a vision based on a desired outcome, a vision that nails down what is going to work and what isn’t to drive improvements in speed, reliability and efficiency of a product’s delivery.

The shift has opened up new channels of communication and new ways of interacting across teams, even to the point of how we collocate in our workspace. We have seen a radical change in the quality of our intercommunications because people are developing prototypes, conducting tests and not working off huge requirement documents.

Our goal is to create a seamless integration of product, technology and design that optimizes the experience for our customers.

It hasn’t all been easy, but progress doesn’t occur without change. We certainly can’t transform all teams into this model at once. However, with patience and by modeling teams’ successes, we are seeing increases in quality and speed, and the enthusiasm of the teams is amazing. They are so engaged because they see a direct correlation between their work and dramatically improved customer interactions. There’s more alignment among product, technology and client services than we’ve ever seen before. If you are thinking of trying something similar in your offices, I recommend forming a shared goal and a shared alignment across all teams, and placing the focus on future growth. Your efficiency and product development and delivery will improve, and that’s what everyone is looking for, after all.

 


In their new NVTC member blog post, Alvarez & Marsal Taxand discusses how companies in the tech industry can prepare for proposed tax reforms that may be implemented in the near future. Alvarez & Marsal Taxand, an affiliate of Alvarez & Marsal (A&M), a leading global professional services firm, is an independent tax group made up of experienced tax professionals dedicated to providing customized tax advice to clients and investors across a broad range of industries. Alvarez & Marsal Taxand is a founder of Taxand, the world’s largest independent tax organization, which provides high quality, integrated tax advice worldwide. 


Under the House Republican “Blueprint for Tax Reform” (the Blueprint), companies would be able to deduct interest expense against interest income, but no current deduction would be allowed for net interest expense. Any net interest expense would be carried forward indefinitely and allowed as a deduction against net interest income in future years. In addition, the proposed reduction of U.S. tax rates may also reduce the value of U.S. interest deductions. These proposals should impact decisions right now around multinational intercompany financing structures for tech companies, as well as other aspects of their intragroup contractual arrangements.

Until now, the high U.S. corporate income tax rate of 35 percent has created an environment that favors foreign related-party lending to U.S. affiliates, particularly when the loan is advanced from a low-tax jurisdiction. U.S. taxable income may be reduced via an interest deduction and the corresponding interest income may be captured in the lower tax jurisdiction. Alternatively, tax considerations may have made it desirable to incur third-party debt in a U.S. group company, rather than in lower-taxed group companies. The feasibility and/or desirability of these sorts of “earnings stripping” benefits would be greatly diminished by the Blueprint.
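A quick back-of-the-envelope calculation shows why both pieces of the proposal matter. The interest figure and the 20 percent rate below are assumptions for illustration only, not the enacted rate or any client’s facts:

# A minimal sketch of how a lower corporate rate shrinks the value of an interest deduction.
def tax_savings(interest_expense: float, corporate_rate: float) -> float:
    """Cash tax saved by deducting the interest expense at the given rate."""
    return interest_expense * corporate_rate

interest = 10_000_000  # hypothetical net interest expense on intercompany debt
print(f"At 35%: ${tax_savings(interest, 0.35):,.0f} saved")   # $3,500,000
print(f"At 20%: ${tax_savings(interest, 0.20):,.0f} saved")   # $2,000,000
# Under the Blueprint, a net interest deduction might be disallowed entirely in the current
# year, so the near-term savings could be $0 until offset by future net interest income.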

So, how are forward-looking companies, particularly in the tech industry, preparing for these potentially dramatic changes? We are seeing a number of them explore the following questions:

1. Should the debt level of U.S. group companies be reduced and, if so, how?
2. Should the interest rate be reduced on intragroup debt financing of U.S. group companies?
3. Can we replace debt financing with other forms of financing arrangements that may yield deductions other than interest expense for the U.S. company (e.g., rent expense on sale / leaseback transactions, royalty expense on intellectual property (IP) licensing transactions)?
4. Should U.S. group companies make interest-bearing loans to other group companies that can benefit from interest deductions in their countries, thereby creating interest income in the U.S., against which the U.S. company could then deduct its own interest expense (e.g., should a U.S. company be a group finance company)?
5. Can lost interest deductions be replaced by more aggressive transfer pricing for other intragroup transactions (e.g., the intragroup purchase and/or sale of goods or services)?

All of these questions regarding intragroup transactions have important transfer pricing implications. For most intragroup transactions (other than those rare instances when the comparable uncontrolled price method is the best method), the prevailing transfer pricing theory permits a range of choices for the intercompany transfer price. So, whether the decision relates to the level of U.S. indebtedness, the substitution of interest expense with other types of deductions, or the creation of interest income in the U.S., the after-tax impact of those decisions can be significantly enhanced by proactive transfer pricing planning. This is true regardless of whether the objective is the more traditional one of minimizing taxable income in the U.S., or a new one to increase taxable income in the U.S. (with the offsetting decrease in other countries with higher tax rates) in light of dramatically changed U.S. tax rules. Our international tax and transfer pricing specialists can help your company to determine the most desirable course of action and to substantiate an appropriate / defensible range of choices for intercompany prices that will yield the optimal results.

Visit A&M Taxand’s Tax Advisor Minute for more helpful insights for executives in the technology sector.


This week’s member blog post is from Tangible Security Executive Chairman and CEO Dr. Mark Mykityshyn. Tangible Security employs the most sophisticated cybersecurity tools and techniques available to protect clients’ sensitive data, infrastructure and competitive advantage. Dr. Mykityshyn discusses the current regulatory climate around drones and unmanned aircraft systems and the need for new policies to fuel market growth in the industry.


Undoubtedly, drones and unmanned aircraft systems (UAS) are a very hot topic these days and their technology, business, policy and cybersecurity implications continue to rapidly expand and evolve.

Tangible Security recently participated in a roundtable meeting in Washington, D.C. that engaged thought leaders and stakeholders from aerospace and aviation, academia, Congress, government and related industry organizations. The group shared ideas, explored and challenged assumptions, and discussed policy positions and current practices in drone/UAS.

The roundtable was organized by ADS Infrastructure Partners (ADS) as part of a national campaign to help fund and establish the Drone/Unmanned Aircraft Systems Regulatory Association (DURA), the first step in unlocking the full economic value of the sector.

Roundtable conferees widely acknowledged that development of the drone/UAS commercial market is constrained, in great part, due to the existing FAA regulatory environment and the slow pace of rulemaking and certification. The group recognized that drone/UAS sector regulation requires urgent streamlining to realize full market potential, economic growth and jobs.

According to FAA’s recent market forecast, sales of UAS for commercial purposes are expected to grow from 600,000 in 2016 to 2.7 million by 2020. Industry experts have recognized that this growth, and the billions of dollars at stake, may not materialize without overhauling the current regulatory model.

Conferees also agreed that the immediate next step is to explore the pros and cons of drone industry regulation through delegation of FAA authority mandated by Congressional legislation, and to develop a blueprint for the new organization. The creation of DURA, an archetype of an industry-led public-private partnership, is an idea whose “time has come,” according to many roundtable attendees.

According to Jim Williams, head of JHW Unmanned Solutions, and most recently the Manager responsible for the FAA’s Unmanned Aircraft Systems Integration Office, “The future of unmanned aircraft operations depends on finding new ways to manage the airspace and regulate the operators. Forming a delegated organization to manage the airspace, approve the vehicles, and oversee the operators is the key to opening up this extremely valuable new segment of aviation.”

To expand this dialog nationwide, ADS will hold a National Summit in Washington, D.C. in September 2017 where leaders who represent more than five hundred businesses, agencies, associations, customers and stakeholders will assemble.

If you or your organization is interested in participating in DURA or attending the National Summit in September, please don’t hesitate to email me. All members of the technology, aviation and business community are invited to attend.


Did you know nearly 90% of all successful ransomware attacks were on hospitals in 2016? In his guest blog, Ostendio CEO and Co-Founder Grant Elliott sheds light on the cybersecurity challenges facing healthcare today and the importance of engaging healthcare employees in cybersecurity. Elliott will be speaking on the Cybersecurity Panel at the Capital Health Tech Summit taking place on June 15, 2017 at the Inova Center for Personalized Health.


Why is healthcare so heavily and successfully targeted by cybercrime? After a record number of breaches last year – nearly 90% of all successful ransomware attacks were on hospitals – it’s a question that needs to be asked.

Cybercriminals target healthcare data because hospitals need immediate access to up-to-date patient information in order to provide critical care. When malware enters the system, it blocks access to data and, in turn, prevents hospital staff from efficiently and effectively treating a patient. The cybercriminals then demand a ransom, usually in the form of Bitcoin. Ransomware is growing in popularity because it works. In 2014, the FBI estimated that the minds behind the CryptoLocker strain of ransomware received nearly $27 million in just six months from data taken hostage.

When MedStar Health, a health system serving the Baltimore/Washington region, was hit by a cyberattack in 2016, it chose not to pay the Bitcoin ransom, opting instead to shut down its electronic medical record systems entirely.

Hospitals are also a prime target because employees aren’t always trained on security awareness. While HIPAA aims to ensure that patient privacy is protected, in general, hospitals do not place a big enough emphasis on the importance of cybersecurity. Protecting data has always been a challenge, but an aware and invested workforce can become your company’s first line of defense.

So, what can be done to try and reduce the number of data breaches?

Look to your employees. Employees are an organization’s greatest asset, and they need to be treated as such. It takes just one click on a malicious link to bring a whole system down. Make sure that each and every employee understands their role in a cybersecurity program. They need to know where data is, when they should access it, how it should be used and how it’s being protected. Only then can they become your front line of cyber defense.

Learn more about Ostendio here and check out the latest Capital Health Tech Summit agenda!
