Is your organization a startup or in its early stages? In its new member blog, Alvarez & Marsal Taxand shares opportunities around the 2018 Qualified Small Business (QSB) R&D Tax Credit for your small business. Read on to learn more!


Last year, many early-stage companies, including many of our startups in the tech industry, significantly reduced their payroll taxes thanks to a new federal tax credit. The Qualified Small Business (QSB) R&D Tax Credit, passed into law in 2015, allows qualified small businesses to offset OASDI (i.e., Social Security) taxes with R&D tax credits originally claimed on federal income tax returns. The 2018 calendar year presents yet another opportunity for companies to realize these savings. How are these credits achieved?

1. Assess Eligibility – Companies must have less than $5MM in revenues and no revenues prior to 2013. The credit is most applicable to companies with early, heavy R&D expenditures before they begin generating revenue. Common examples include companies in deep technology, biotech and life sciences, and manufacturing. Further, companies with at least $300K in annual payroll are most likely to realize a material benefit (see the illustrative sketch after these steps).

2. Plan Ahead – Companies seeking QSB R&D Credits this year must first claim the credit on their 2017 federal income tax returns. Because a company may not amend an income tax return to claim the credit, it is important that companies interested in this credit NOT file federal income tax returns without first assessing whether they qualify. Remember, savings cannot be realized until one quarter AFTER the income tax return is filed. So if you’re planning to take advantage of the QSB Credit opportunity, DO NOT WAIT: 2017 returns must be filed before March 31, 2018 if you want to bank these savings by the second quarter of 2018.

3. Seek Assistance – While QSB Credits often produce significant savings, they are not a “free lunch.” Not only do companies without experience in this area run the risk of leaving savings on the table, but they could also incur interest and/or penalties related to payroll tax mistakes. Reach out to an advisor for assistance in assessing if QSB Credits make sense for your business.
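For a rough sense of the numbers behind the first step, here is an illustrative sketch (my own simplified assumptions, not part of A&M’s guidance and not tax advice): the credit offsets the employer’s 6.2% OASDI tax, and the annual payroll tax offset is capped ($250,000 per year under the rules as originally enacted). The figures below ignore the per-employee Social Security wage base and are purely for illustration.

    # Illustrative only -- not tax advice. Simplifying assumptions: a flat 6.2%
    # employer OASDI rate on total payroll (ignoring the per-employee wage base)
    # and the $250,000 annual cap on the payroll tax offset.
    EMPLOYER_OASDI_RATE = 0.062
    ANNUAL_OFFSET_CAP = 250_000

    def estimated_payroll_tax_offset(annual_payroll: float, rd_credit: float) -> float:
        """Rough upper bound on the OASDI tax a qualified small business could offset in one year."""
        employer_oasdi = annual_payroll * EMPLOYER_OASDI_RATE
        return min(employer_oasdi, rd_credit, ANNUAL_OFFSET_CAP)

    # Example: $300K of payroll generates roughly $18,600 of employer OASDI tax,
    # so that is about the most the credit could offset in that year.
    print(estimated_payroll_tax_offset(300_000, 50_000))  # 18600.0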

Our team at Alvarez & Marsal Taxand is ready to assist any company interested in these savings. For more information, please click here.


NVTC’s latest guest blog post is by Michael J. Redding, Managing Director – Strategic Technology Innovation, Accenture Ventures. Accenture is a professional services firm that answers real business challenges for its clients through innovation and deep industry knowledge. Find out how on March 22, 2018, when Redding presents Accenture’s 2018 Technology Vision trends at a special networking event with NVTC and Accenture.


By Michael J. Redding


We’re moving into a time where continuous technology innovation and acceleration is the new normal. Technology is reshaping our daily lives, and it’s shaping society as a whole.

For companies to weave themselves into the fabric of people’s lives requires a new level of trust. It’s not just business. It’s personal. And leading companies are seeing strong correlations between trust and growth. They’re thinking beyond their direct business goals to how they connect with their communities and the people who live in them.

Accenture’s annual Technology Vision looks at the top trends that will have the greatest impact on enterprises over the next one to three years. This year’s report, themed Intelligent Enterprise Unleashed, examines the following trends, which leverage disruptive technology to build the kinds of connections being sought by customers, citizens, employees and business partners – the connections that can lead to business growth.

Trend #1: Citizen AI: By “raising” AI responsibly, businesses will create a collaborative, powerful new member of the workforce.

Trend #2: Extended Reality: Immersive experiences are changing how we connect with people, information, and experiences.

Trend #3: Data Veracity: Businesses must adapt existing capabilities to combat a new kind of vulnerability: inaccurate, manipulated, and biased data that leads to corrupted business insights.

Trend #4: Frictionless Business: To fully power the external technology-based partnerships they depend on for growth, companies must first move on from internal legacy systems and re-architect themselves.

Trend #5: Internet of Thinking: Enabling intelligence for the next generation of technology demands an overhaul of existing infrastructures.

Bringing the Accenture 2018 Technology Vision to life through engaging demos.

 

Technology is the enabler of a bigger role and better partnerships for your Intelligent Enterprise – and a brighter future for us all.

Do you know how you will reimagine and reinvent your business? How will you use these trends to drive growth and differentiation? How will you partner with people, and what expanded responsibility do you have in society? Join me on March 22nd for a deeper dive into these trends, hands-on demos, and discussion about the new opportunities technology can bring.

Caption: Michael Redding and Ben Holfeld, a.k.a. “Dr. Demo,” of Accenture will both be at NVTC and Accenture’s March 22 event at the Center for Innovative Technology in Herndon.


By Nathan Self, Research Associate, Department of Computer Science, and faculty at Discovery Analytics Center, Virginia Tech. Self will be participating on the Opportunities, Challenges and Future Trends in Advanced Analytics panel at the second annual Capital Data Summit on Feb. 28, 2018.

Imagine your biggest spreadsheet. Too many rows and columns to take in at once. At the Discovery Analytics Center (DAC) at Virginia Tech we are interested in how humans and machines can work together to make sense out of all that data. Andromeda is an example of how analysts can combine sophisticated machine learning algorithms with interactive visualization to get insights from their data.

Let’s assume that your enormous spreadsheet has a row for every customer and a column for every statistic you keep about each customer. Andromeda draws a scatterplot in which each point represents a customer. Points that are close to each other represent rows that are similar to each other. Likewise, distant points represent dissimilar rows. To begin with, Andromeda assumes that each column is equally important.

This is where the human aspect comes in. You know what the data actually represents, and you can interact with Andromeda in two ways. First, you can change the importance of a column; the points of the scatterplot will regroup to preserve the “near is similar” constraint. Second, you can reach right into the scatterplot and move points closer to or farther from each other, and Andromeda will compute which columns have to be important for those points to be considered similar.
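To make the mechanics a little more concrete, here is a minimal sketch of the first interaction in Python, using scikit-learn’s generic multidimensional scaling. It illustrates the underlying idea (column weights in, 2-D layout out); it is not Andromeda’s actual implementation, and the data and weights are invented.

    import numpy as np
    from sklearn.manifold import MDS

    def weighted_layout(data, weights):
        """Lay out rows of `data` in 2-D so that rows that are similar under the column weights land close together."""
        diffs = data[:, None, :] - data[None, :, :]           # pairwise row differences
        dists = np.sqrt((weights * diffs ** 2).sum(axis=-1))  # weighted Euclidean distances
        mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
        return mds.fit_transform(dists)                       # one (x, y) point per row

    customers = np.random.rand(50, 4)          # 50 customers, 4 statistics each
    weights = np.array([1.0, 1.0, 3.0, 1.0])   # make the third column three times as important
    coords = weighted_layout(customers, weights)

The second interaction runs this in reverse: given the new pairwise distances implied by the points you dragged, solve for the column weights that best explain them.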

Andromeda, with its new algorithms and new paradigms of user-algorithm interaction, serves as a good example that complex statistical methods do not necessitate complex user interfaces or expert users. The selling point of Andromeda is that you don’t have to know it uses an algorithm called weighted multidimensional scaling to lay out the scatterplot. You don’t have to know that data scientists at DAC developed inverse multidimensional scaling to handle interacting with the points. In fact, users have effectively generated insights despite having no experience with these, or similar, algorithms. And their insights are more complex than when they use spreadsheets alone.

There are countless pivotal statistical processes that are well suited to current data analysis needs. Any of them would be well served by intuitive interfaces for people who are not experts in statistics.

There is an adage that the same statistics can be used to justify either side of an argument. Machine learning has the same malleability. Understanding exactly what a machine learning tool like Andromeda can do for you — as well as its limitations — is important for deciding what to do with its outputs.


The chip vulnerabilities announced two weeks ago affect almost every single PC, Mac, laptop, tablet, and smartphone created in the last 20 years. Passwords, personal information and any secure information on a device are at risk.

How It Works
These vulnerabilities affect the “kernel,” or core, of the operating system, which acts as a bridge between the hardware and software of a machine. The core handles everything from typing and clicking to opening and running applications like web browsers and Microsoft Outlook. It provides each process with the resources it needs to function and keeps the processes isolated from one another.

When exploited, this security flaw allows an attacker to subvert this isolation and read all of the protected data on a computer. These data could include anything from passwords to personally identifiable information from your tax program. These attacks can be especially damaging for cloud services because they run shared setups where users share hardware but are isolated by software. By hacking one user’s cloud instance, hackers can use these security flaws to see all the data on the shared hardware. Fortunately, the security flaw is difficult to exploit, and attackers must compromise a machine before they can exploit these chip vulnerabilities.

What’s Being Done?
Companies like Intel, Apple, Google, and Microsoft have released patches to defend against these security flaws. The patches further isolate the core’s memory but may degrade performance by as much as 30 percent. Most vendors report that general computer users will not see such a large decrease in performance.

LMI is implementing a remediation plan to patch all vulnerable systems. We will test the patches against a pilot group of machines before releasing them to the rest of the organizational ecosystem. This will allow our team to identify and address potential issues with the patches to maintain LMI operations as the patches are implemented.

To protect your personal systems, install patches and operating system updates as soon as they are released, and make sure your web browsers are up to date. Most browsers automatically update, but it is beneficial to verify that they have been patched.
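As one concrete way to verify this on a personal Linux machine (an illustrative aside, not part of LMI’s remediation plan; Windows and macOS report patch status differently), kernels updated since January 2018 expose their mitigation status under /sys/devices/system/cpu/vulnerabilities/:

    from pathlib import Path

    vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")

    if vuln_dir.is_dir():
        # Each file (e.g. meltdown, spectre_v1, spectre_v2) reports either
        # "Vulnerable" or the mitigation the running kernel applies.
        for entry in sorted(vuln_dir.iterdir()):
            print(f"{entry.name}: {entry.read_text().strip()}")
    else:
        print("Kernel does not expose mitigation status; install the latest OS updates.")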

Looking Ahead
As an organization directly involved in the cyber space, LMI is aware that security flaws and exploits will continue to be a concern. This specific cyber scare is far less concerning than the number of security vulnerabilities we saw in 2017. We expect to see even more vulnerabilities in 2018 because of the evolving nature of hackers and the spread of far-reaching security flaws. Our team will continue to adapt to the ever-changing cyber threat landscape as threat actors change their tools and techniques.


Jonathan Stammler is the Information Security Manager for the Enterprise Technology Services group at LMI. He received an MS in information security from Georgia Institute of Technology and a BS in information technology from George Mason University. If you’d like more information on how LMI can assist your organization with its cybersecurity needs, please email Jonathan.


Gartner predicts there will be an estimated 8.4 billion IoT devices by 2020. Tenable President, Chief Operating Officer and Co-Founder Jack Huffard discusses how the proliferation of digital assets and connected devices is creating an exposure gap in cyber defense and shares how organizations can fight back against cyber-attacks. Huffard participated on the Successful Cybersecurity Growth Companies in the Region panel at the Capital Cybersecurity Summit on Nov. 15, 2017.


It’s been more than two years since the Office of Personnel Management (OPM) disclosed one of the largest data breaches in history, but just last week the agency’s inspector general gave it a failing grade in critical areas like risk management and contingency planning.

In addition, the data breaches and attacks we’ve recently seen across a variety of industries, including entertainment, critical infrastructure, retail and finance, make it clear that all organizations are still failing when it comes to basic cyber hygiene.

Today, a company’s assets range not just from laptops to servers, but also include mobile devices, internet-connected appliances and the cloud. The latest research shows the number of these assets is only going to increase. For example, Gartner predicts there will be an estimated 8.4 billion IoT devices by 2020. And according to a 2016 IDG Enterprise Cloud Computing Survey, 70 percent of organizations already have apps in the cloud and 16 percent more will within 12 months. This modern, elastic attack surface, where the assets themselves and their associated vulnerabilities are constantly expanding, contracting and evolving, has created a massive gap in organizations’ ability to truly understand their cyber exposure at any given time.

Another major component of today’s elastic attack surface is operational technology (OT), particularly given the growing risk of cyber-attacks against critical infrastructure sectors. A recent Ponemon Institute study on the state of cybersecurity in the U.S. oil and gas industry found, for example, that OT targets now comprise 30 percent of all cyber-attacks. As with cloud and IoT assets, the cyber exposure gap here is exacerbated by the mismatch between the cyber measures deployed by critical infrastructure companies and the rapid pace of digitization in their operations. Operational technologies present an additional challenge – they often can’t be assessed with the same approaches as IT assets, creating blind spots for security operations and compliance teams.

We recently announced a partnership with global engineering and technology leader Siemens that aims to address those unique risks. The product, Industrial Security from Tenable, was designed specifically for industrial control systems and will be delivered through Siemens to give energy and utilities companies full visibility into production networks to reduce compliance risk and their cyber exposure.

Both public and private organizations in every sector need to change their approach to cyber risk to effectively manage their cyber exposure. That starts with understanding and protecting what matters most across their entire attack surface. And it means looking at server and endpoint hardening, IoT discovery and hardening, container and web app vulnerability identification and OT asset and vulnerability detection.

Understanding risk and cyber exposure is also an awareness issue that should start at the top. If the C-suite and board of directors know which areas of their business are secure or exposed, that knowledge can drive strategic business decisions, including where and how much to invest to reduce risk. Attackers will always find the weak link, and right now there are too many weak links – even more than companies are aware of.

This year alone, there were several high-profile, large-scale cyber-attacks, including the NotPetya destructionware, the CrashOverride/Industroyer threats to critical infrastructure, and the Reaper IoT botnet. These attacks claimed millions of dollars in damages and compromised sensitive customer data; no organization wants to experience one of these security headlines firsthand. Only with a holistic approach that starts with basic cyber hygiene – visibility to identify all assets and their vulnerabilities – can companies secure today’s complex attack surface.


The world’s total digital data volume is doubling in size every two years, prompting organizations to find new ways to secure their complex data. In their new NVTC member blog, LMI provides tactics for determining cybersecurity threats in your organization’s digital supply chain and securing critical data.


The world’s total digital data volume is doubling in size every two years, and by 2020 will contain nearly as many digital bits as there are stars in the universe. Most of this data is created and communicated over the Internet, whose “population grew by more than 750 percent in the past 15 years to over 3 billion. This population shares more than 2.5 million pieces of content on Facebook, tweets more than 300,000 times, and sends more than 204 million text messages—every minute.”

With the advent of the Internet of Things and other innovative technology platforms, organizations must continuously analyze and secure their complex data. For supply chain operations, digitalization has enabled leaders to access data faster and build stronger connections within a given supply chain. While there are clear benefits of the digital supply chain, there are challenges that need to be overcome in order to realize its full potential.

According to Ernst & Young, complex data presents numerous challenges to supply chains:

  • The volume of data is skyrocketing as diverse data sources, processes and systems show unprecedented growth. Companies are trying to capture and store everything, without first establishing the data’s business utility.
  • The fact is, technology is enabling this proliferating data complexity. Continuing to ignore the need for an enterprise data strategy and information management approach will not only increase “time to insight,” but may actually lead to incorrect insights.

Perhaps none of these challenges is as critical as an organization’s ability to successfully secure its supply chain data, given the IT security risks posed by the Internet. In fact, 30 percent of supply chain professionals are “very concerned” about a data breach. Those concerns are well founded: the number of cybersecurity breaches is growing by 64 percent every year, with 60 percent of cyber breaches linked to insiders – current and former employees, contractors, service providers, suppliers and business partners.

Unfortunately, many organizations are unaware of the security vulnerabilities within their supply chain or how to identify them. To determine your organization’s vulnerabilities, start by answering the following three questions (a brief illustrative sketch follows them):
1. How will the product be used and managed in the system? While any system breach is bad, the compromise of a system managing classified data is much worse than the compromise of a system managing publicly available data. Understanding the use of the Information and Communication Technology (ICT) equipment will help determine the resources appropriate to secure the system. In reviewing the product’s use, consider what other systems are connected to the focus system. A less secure system can serve as a pathway to attack a more highly secured connected system; this was the method used to steal credit card numbers from Target in 2013.

2. How is the system connected to the rest of the world? A system that is connected to the public Internet will need more reliable security, since it would be easy to find and attack. On the other hand, a system that is isolated from any other network would have a much lower risk of attack or data breach, since the attacker would need to be in physical proximity of the system.

3. Who are the system users? Are the users internal employees who are trained on security procedures, or is the system accessed by a public user base that may not be mindful of risky security behaviors? Simple security procedures, such as keeping passwords secret and maintaining current anti-virus software, cannot be counted on if you do not directly control users’ environments.
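As a toy illustration of how the answers might be combined (a hypothetical sketch, not LMI’s methodology; the inputs and tiers are invented for this example):

    def supply_chain_risk(handles_sensitive_data: bool,
                          internet_facing: bool,
                          uncontrolled_users: bool) -> str:
        """Map answers to the three questions above onto a coarse risk tier."""
        score = sum([handles_sensitive_data, internet_facing, uncontrolled_users])
        return {0: "low", 1: "moderate", 2: "high", 3: "critical"}[score]

    # Example: an internet-facing system that manages sensitive data and is
    # accessed by a public user base outside your control.
    print(supply_chain_risk(True, True, True))  # critical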

By answering these questions, organizations can quickly and effectively determine the security vulnerabilities within their digital supply chain. Organizations can also contact our cybersecurity experts, who can help you monitor, prioritize, and effectively manage your risks to create an optimal level of security based on mission priorities and resource constraints.

 


What do cloud and AI mean for human resources? Will automation replace human resource functions and associates? Read on to find out in Insperity’s newest NVTC guest blog.


Cloud-based tech solutions for human resources offer the promise of easy installation and implementation, but does such software really eliminate the need for HR staff?

The short answer is no.

While new technical offerings can improve the efficiency and speed of many HR processes, the human touch is still needed to get the most out of the software.

For example, you’ll still need someone to “operate the machinery,” so to speak, or administer the software. In a smaller company, that may be one combination payroll and HR person. In a larger company, you may need one employee to do nothing but maintain, update and run the software so that your company gets the most from its capabilities.

When HR software works best

Technology is your friend when it comes to the tactical aspects of human resources. For instance, an online time tracking system that ties to your payroll and government reporting systems can save significant time and improve accuracy over manual tracking and handwritten reports.

Cloud-based HR software can automate formerly complex, time-consuming activities including:

  • New hire paperwork, such as Form I-9 verification of the right to work in the United States
  • Storing of data for compliance
  • Tracking of critical HR data related to hours worked by project or department, turnover and more
  • Garnishments, reporting and mandatory requirements that vary by state

For example, a company operating in a big state like Texas may not be accustomed to the HR complexities of hiring across state lines. But open an office in New York, and you could have employees who work in that state but live in Connecticut or New Jersey.

HR software can help ensure your compliance with multiple states’ payroll tax requirements, and prevent you from having to learn and implement such widely disparate laws on the fly. The best-case scenario is when you have the right software in place to facilitate efficiency and compliance, with access to experienced HR professionals to guide you.

What to look for in HR software

Once you’ve decided whether an HR software package delivers the basic functions your business needs and will help drive company goals, it’s time to take a deeper dive into its functionality.

Some questions to consider:

  • What purpose will this software serve? Will it eliminate, add to or integrate with your existing systems?
  • Who will administer the software? Will they require extra training? If yes, how much? How much training is included in the price?
  • Is this software backed by HR on demand? For example, even with the best software, you’ll still have the occasional compliance question. Look for a software solution with human support.
  • Will this software integrate with other existing software for payroll, time and attendance, or enterprise resource planning (ERP)?

As you talk to software vendors, it’s vital you involve frontline workers who operate existing systems to help you evaluate any new HR software and its integration requirements. Depending on your current set-up, this may mean you bring in the payroll administrator, ERP data manager, compliance officer or the HR specialist managing the current performance system.

These are the people who can help you avoid the costly mistake of buying software that ultimately will not “play nice” with your other systems, since they know the intricate details of how your existing systems really work.

Why leadership is still needed

While cloud-based software may streamline many HR processes, there’s no substitute for sound leadership. Think “strategic” versus “tactical.”

Yes, software can help a company align its objectives and drive engagement through performance management, employee feedback mechanisms, people analytics, training, and compensation and rewards systems. But no software will ever replace a leader who communicates, inspires and motivates employees to achieve the organization’s goals.

As a business grows, it becomes harder to keep employees aligned with the company’s goals and strategies. Software can help keep your ship on the right course, but at the end of the day, any technology solution is only as good as the people behind it.

Learn more about Insperity here.


Did you know ninety-nine percent of Internet traffic travels through fiber cables on the ocean floor? Or that Virginia has one of the world’s most abundant fiber optic cable networks? In NVTC’s latest blog, DFT Data Centers Vice President of Product Management and NVTC Data Center and Cloud Infrastructure Committee Leadership Board Member Vinay Nagpal discusses the fascinating world of fiber connectivity and the exciting new subsea fiber cables coming to Virginia Beach.  


Subsea fiber cables on the ocean bed

Ever since the advent of the Internet, there has been a common myth that Internet traffic travels through satellites. That is not true. Ninety-nine percent of the world’s Internet traffic travels through subsea cables that are laid on the ocean bed, like the cable shown in the picture (image, left). The oceans cover over seventy-one percent of the Earth’s surface and the explosive growth of the Internet that we are experiencing is constantly challenging us to ensure that there is adequate infrastructure in the oceans to handle the growth.

Northern Virginia, also known as “Data Center Alley,” is not only the mecca of data centers in the world, but also has the most abundant fiber optic cable network installed underneath its roads, pavements, medians and sidewalks. This has resulted in an astounding statistic: upwards of seventy percent of the world’s Internet traffic passes through Northern Virginia. It is striking to note that, up until now, all of that traffic leaving the eastern seaboard of the U.S. has traveled either north to New York or New Jersey, or south to the Miami area, where landing stations connect the land to the ocean and on to the outside world. This land-ocean-land connection happens by virtue of subsea cables that connect two seaports in different countries and, often, different continents altogether.

The world at your fingertips

Why are subsea cables important? They are important because of the unimaginable growth of the Internet and the way the Internet has become intertwined with our lives. The use of the Internet, from wearable tech to autonomous cars to Internet-enabled toasters and refrigerators, is just the beginning; we have barely scratched the surface of its potential adoption. The Internet has drastically changed the taxi, hotel and many other industries. It has given us freedom and power. Corporate America is moving its IT infrastructure out of its own facilities and placing it in the hands of shared technology czars, who manage enterprise data and make sure it is accessible to users in a cost-effective model.

We live in an extremely connected world – from WiFi at airports and railway stations, on airplanes and cruise ships, to fast fiber connections in our homes – and we often take this connectivity for granted. Without fiber cables, there would be no streaming 4K movies from the comfort of our reclining sofas. We wouldn’t have the ability to get content and data when we want it, where we want it and how we want it.

Subsea fiber cable

Fascinatingly, these fiber cables (image, right) are thinner than human hair and about 1,000 times stronger. The light transmitted through these cables carries all of our data from one point to the other, from one city to the other, from one state to the other, from one country to the other, and from one continent to the other.

Transoceanic cables connecting continents are not a new concept – the very first transoceanic cable was laid on the ocean bed over 150 years ago, in the 1850s. There are currently over 350 subsea cables carrying Internet traffic daily on the ocean beds, and over 40 active subsea cable projects underway across the world. We are also doing a great job feeding the sharks. Yes, sharks still like to bite these cables, and that remains a persistent problem on the ocean bed.

These cables are extremely expensive to build and operate – typically a few hundred million dollars. A cable can easily cost in the vicinity of $300-$400 million and take about two to three years to go from concept to operational. In that time, a feasibility study must be completed and permits and licenses acquired; as you can imagine, crossing international waters involves multiple countries and their laws.

Virginia Beach connected to the world

For the very first time in the Commonwealth of Virginia, we are going to have a direct fiber cable crossing the Atlantic. This cable (image above) will connect Virginia Beach to Bilbao, Spain. Co-owned by Microsoft, Facebook and Telxius (the subsea cable company owned by Telefonica, the Spanish carrier), the cable, called MAREA (Spanish for “Tide”), will be the fastest cable ever to cross the Atlantic Ocean once it is installed and operational. The second cable under development is BRUSA, connecting Virginia Beach to Puerto Rico and Brazil. A third project in the final stages of consideration is Midgardsormen, which would connect Virginia Beach to Blaabjerg, Denmark. Beyond these, nine more cable projects are under consideration. Understandably, not all of these projects will see the light of day (literally speaking), but by the time some of them become a reality, I am sure additional new projects will be under consideration.

NVTC Data Center & Cloud Infrastructure Committee meeting on July 13, 2017

On July 13, the Northern Virginia Technology Council (NVTC) held its Data Center and Cloud Infrastructure Committee meeting on the topic Subsea Cables Coming to Virginia Beach: What It Means for Virginia and the World. The distinguished panel included speakers from Telxius, AcquaComms, NxtVn and the City of Virginia Beach, with special cable samples and maps provided by TE SubCom.

Part 2 of the meeting will be held later this year and will focus on bringing the subsea capacity from Virginia Beach to Northern Virginia and other parts of the country. If you are interested in participating in Part 2 of the meeting, please contact Vinay Nagpal at vnagpal@dft.com.

You can view the presentation slides from the July 13 meeting by clicking here or viewing below.


NVTC’s newest blog is by Dovel Technologies Vice President Mike Atassi. Atassi recently moderated the healthcare data analytics panel at the inaugural Capital Health Tech Summit on June 15, 2017. Scroll below for full video of the panel.


Data is being generated at unprecedented levels – with more than 2.5 quintillion bytes being created every day. Unlocking the potential value of this data will help accelerate research, develop targeted therapeutics and improve the delivery of healthcare. Today’s information and computational sciences and technologies are playing a critical role in delivering better healthcare to everyone.

Accelerating the path to discovery and finding targeted therapeutics for some of the most chronic diseases is a promise that can largely be fulfilled by exploiting available data. Whereas primary investigation has been the most important source of new data and discoveries, today we see data scientists curating existing data to make it searchable, accessible, interoperable and reusable.

A panel of experts discussed the role of data analytics in the continuum of health at the recent NVTC Capital Health Tech Summit and provided valuable lessons on how to protect, govern, and transform data into valuable information and health insights. The panelists discussed different ways to enable health data to be searchable, accessible, interoperable and reusable. Key themes from the discussion included:

Building a data-rich infrastructure: Incorporating genomic and proteomic data into clinical delivery is a challenge that is being met with innovation in technology and information architecture, transforming large, disparate data sets into consumable, actionable packages.

Utilizing advancing technologies: Deploying machine learning and predictive analytics alongside data, processes, and the workflows that already exist within hospitals can help to predict and prescribe new protocols.  For example, the use of predictive analytics and machine learning resulted in a 39 percent reduction in patient falls in just six months at a local hospital.

Improving wellness: Enabling the delivery of integrated wellness, disease management, and healthcare services to the community based on insights from data. For example, data analytics is playing a key role in improving the effectiveness and global efficiency of transfusion medicine and cellular therapeutics.

Reducing risks: Helping to prevent the spread of major diseases, such as the Zika virus, by integrating datasets from multiple sources to identify geographic risk patterns. Data also allows for the benchmarking of activities to guide decisions that will make sure that the right person gets the right treatment at the right time.

Preparing a new generation of data scientists: Bringing together interdisciplinary individuals with domain and technology expertise to develop leading public health and precision medicine professionals. Today, many institutions of higher education offer advanced degrees in data science – combining knowledge of the biological sciences with computational and mathematical sciences to produce a generation of data scientists capable of unlocking the value hidden in large and complex data systems. Data scientists are already showing tremendous progress in biomedical computing, developing meaningful solutions for analytics and visualization, as well as data management and governance.

With these advances, real challenges remain, with limitations raised by ethical, legal, procedural and even technological constraints. To successfully meet these challenges, the industry must build on a sound foundation of proven techniques and processes to ensure predictable results. However, the continued convergence and collaboration of biomedical sciences and technologies – along with increased demand for precision healthcare – will provide the impetus to meet these challenges and deliver real breakthroughs for better health.

View full video from the panel:

 

The panel discussion was moderated by Mike Atassi, Vice President, Dovel Technologies, and included Aaron Black, Chief Data Officer, Inova Translational Medicine Institute; Dr. Abigail Flower, Lecturer, Department of Systems and Information Engineering, University of Virginia; Chris Ghion, Vice President and Chief Information Officer, Adventist HealthCare; and Dr. Barbee I. Whitaker, Senior Director, Department of Research, AABB.


Common Pitfalls of Cloud Migration Planning


Is your organization considering a transition to the cloud? Or is your company already making the switch? You’ll want to read this new guest blog post by Tom Tapley, senior consultant in the Systems Development group at LMI.


Every technology wave requires people to develop new skill sets. Tomorrow’s job titles have not been invented yet. So when a government agency decides to move computing to the cloud, it sets off a chain reaction of changes for everyone in that agency who works with technology. “Moving to the cloud” may sound like a technology project, but it is just as much about training people.

In many agencies, teams of people procure and maintain servers, routers, switches and related hardware. These employees are experts in making machines run smoothly, quickly and reliably. Days are spent physically configuring servers in data centers.

With cloud computing, those hands-on skill sets are no longer needed in-house; the work becomes the responsibility of cloud service providers. The servers, racks, and air-conditioned spaces, which may have been in government properties, will be emptied and the space repurposed.

Now agency employees need training to monitor and manage the cloud, using scripts rather than screwdrivers. In the past, there may have been a division between those who coded and those who ran server operations. Those roles are becoming more and more integrated.

Planning for Migration with a Cloud Adoption Framework

A government agency can better prepare for cloud migration by spending more time planning. LMI has developed a Cloud Adoption Framework with four steps: Decide, Prepare, Implement, and Improve. The phase most often overlooked is Prepare, and it is not difficult to see the problems that arise when it is skipped.

Signs an Agency Has Skipped Planning

Here are signs an agency needs to spend more time preparing before engaging in cloud projects:

  1. An agency only hires vendors who migrate data. Many cloud vendors have refined the process of migrating data and applications efficiently. However, if they don’t bring any expertise in enterprise architecture, they may just be moving data and applications in a piecemeal fashion, which creates system lag times as connections become more tenuous (some hosted onsite, while others are hosted in the cloud).
  2. No clear path for cloud migration. In 2010, a Cloud First policy was announced for the federal government. Many agencies tackled easier migration projects, such as switching to Google Mail. After that, they were stuck. They didn’t have a clear idea of what to migrate next and had no model for evaluating what to move or how to gauge the impact of moving different IT assets.
  3. Employee resistance. If employees fear their jobs will change or be eliminated, it is possible they will not provide the most accurate information about the necessity or benefits of the cloud. However, if it is clear employees will be supported as they shift to a new model, it is far more likely they will become allies in efforts to eliminate inefficiencies.


Cloud Migration Improves IT Roles

Managing how employee skill sets will change is often not part of cloud migration planning at the enterprise level. But if employees are engaged in a change management process and it is clearly communicated how the cloud will make their work more satisfying, the agency accrues major benefits.

  • Increased agility: In the past, a sudden need for increased processing power kicked off a complicated procurement process, which involved getting buy-in for budgets, as well as provisioning and cloning servers. With cloud computing, an employee runs a script to create one or a thousand new servers (a brief sketch follows this list). If the need for increased power lasts only a short time, the employee simply scales back the requests for cloud services. No idle physical servers take up space.
  • Less time spent on overextended systems: Most government agencies have systems running on old technology (they may even have code from the era of mainframes). Old code is wrapped in newer code, like a ball of yarn, and new systems interact with it. A team might want to migrate one piece to the cloud, but first must disentangle all the pieces. A project manager might estimate that a cloud migration will cost $25 million, only to find the system is so interconnected with others that the true cost is closer to $100 million. It is critical that agencies pull in employee expertise to gain a comprehensive view of their systems and ensure cost-effective cloud migrations. Employees often know what not to migrate, what should be shut down, and what needs to be built afresh. Most importantly, with cloud services they can focus on building new, robust applications instead of maintaining outdated ones.
  • More in-demand skills: Learning how to manage the cloud has a huge benefit for employees, since cloud-related skills are in high demand. But if agencies skip the workforce analysis and do not cultivate their workforce to take over cloud management, sooner or later they will find they cannot afford to hire new people with the necessary IT skills.
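To ground the first bullet, here is a hedged sketch of what “running a script” can look like with the AWS SDK for Python (assumptions for illustration only: the agency uses AWS with boto3 configured, and the region, image ID and instance counts below are placeholders):

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # One API call provisions anywhere from 1 to 100 identical servers --
    # no procurement cycle, no racking, no cloning of physical machines.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder machine image
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=100,
    )
    print([instance["InstanceId"] for instance in response["Instances"]])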

 

Tom Tapley is a senior consultant in LMI’s Systems Development group. Since joining LMI in 1998, he has performed work for several clients, including the U.S. Postal Service, GSA Public Buildings Service, GSA Federal Technology Service, U.S. Army and the Defense Logistics Agency. Tapley came to LMI after nine years with the Maryland Department of the Environment, where he managed the department’s Geographic Information System and Computer Modeling Division. Tapley has an M.S. in computer systems management from the University of Maryland University College and a B.S. and M.S. in physical geography from the University of Florida.

To learn more about cloud strategy, planning, and workforce readiness, please email ttapley@lmi.org.
