This week on NVTC’s blog, Marty Herbert of NeoSystems Corp. shares the first in a series of tips for workflow and process automation.


If you are an ERP user, you likely know that most applications are rich with features that address the nuances of running projects, especially if you are a government contractor. However, no application can address the many steps an organization must go through to accomplish what might be seen, on the surface, as a simple task.

Take ‘billing,’ for example. I was asked a while back to determine how to route a bill for approval, and I thought it would be a “piece of cake.” Create bill. Send to approver. Get approval. Bill is right – send to customer. Bill is wrong – rinse and repeat. For this article, we’ll use a commonly known GovCon ERP, Deltek Costpoint, as an example. This system is very good at the first part: if you need to create a bill, you can create one replete with support for hours worked and costs incurred. The problem, however, is that there is no nice and simple way of implementing a workflow process that will accommodate most organizations’ review and approval routines within the ERP framework. That’s not a knock against Costpoint; no ERP system on the market adequately addresses this issue, especially when you magnify it by the many, many other processes an organization has in place to accomplish its back-office routines.

Over the next six weeks, we will look at several areas where workflow plays a big role, and at how to leverage workflow automation through integration with your ERP. Companies that don’t know how to automate in these areas waste precious time determining the process, miss steps, and ultimately fail to streamline efficiencies that would save them money down the road.

In our first post, “Evaluating Your Process for Users of Deltek Costpoint or a Similar System,” I’ll examine the role of an AR clerk through my ‘piece of cake’ attempt at automating bill routing.

I had bills created from our ERP and I had Outlook, so I sent two bills to their respective approvers to verify hours were correct so we could bill the services to the client. Then I waited and waited and waited and waited… you get the picture. I followed up via email at least three times over the next week and finally, a week later, I knocked on their doors to see if they had time to review the email I sent.

‘Approver 1’ called me to his desk and had me look at the count of emails in his inbox. Until then, I was unaware that this number could go over 9,999, but there it was. I apologized and helped him find my email. Five minutes later he reviewed it and sent me an email saying we could bill it. Finally, the bill was out the door. I don’t remember whether I had to mail it or email it, but that is of no consequence. Oh, and of course, I forgot to tell my supervisor that I got the bill out the door, so she was unnecessarily on my case the next morning. I’ll try not to make that mistake again.

‘Approver 2’ (let’s call her Amy) asked if I had received her email. She said she had responded immediately to each of the messages I sent, so I crept back to my cube and found her responses. Suddenly I was the culprit in slowing down my own process! “Sorry, this Acme project isn’t mine,” she said. “These should go to Janet, she runs the Acme project.” Ugh! Wouldn’t you know she didn’t even have the courtesy to copy Janet on her response to me. So I trudged down the hall to Janet’s office and had her review the paper copy. She looked at it briefly and said “yep, looks fine.” Great – I was out her door and happy to get the bill out the door. Never mind that I forgot to get Janet to initial the invoice to indicate she had approved it and, of course, I forgot to tell my supervisor I sent the bill. But, hey… bill is out the door, case closed.

Actually, the case was just getting started. The following week, in walks my supervisor. “I got a call from Acme Company’s CFO. She asked me who Francis Miller was and why we were billing Acme for her travel to Las Vegas. When I look in our system, this bill isn’t even posted. When did you send it out? Did you get Amy to review and approve this before you sent it out?” Sorry, I said, I forgot to post the bill in the system, and Amy said the project really belongs to Janet, so I got her to review and approve it… see (as I pulled my copy from the file drawer). But, of course, Janet’s initials weren’t there. Now my boss is mad at me for sending out an invoice that she thinks I didn’t get reviewed AND that I forgot to post. Swell.

I realized there was A LOT of room for improvement in this process. Problem #1: people are swamped with email. Problem #2: people change roles and responsibilities a lot. Problem #3: no coordination between the ERP and the approval activities. Problem #4: I can be my own worst enemy. Why couldn’t all this stuff be linked together somehow, and why isn’t there a way to get things posted in the system without me having to remember every little thing? I’m only human, after all. And this was a simple bill. I could only imagine – or rather didn’t want to in this case – what would have happened if there had been revisions.

From experience, I’ve gathered intelligence on how to sidestep these common pitfalls. Apart from working together as a team, companies tend to think in terms of making changes to their IT infrastructure. What I believe needs to happen is to approach these pitfalls in terms of changing the process infrastructure. There are no short-term ‘quick fix’ changes, but rather logical steps toward automating the manual processes that run at the heart of the business.

Step 1

Get people out of email and into a single system for approvals. This helps solve problems #1 and #3. By logging in to a single system for approvals, the approver gets a “To Do” list that helps them focus on the task(s) at hand. A system that alerts ONLY when an approval is required, and again only when the task is “past due,” goes a long way toward reducing problem #1.

Step 2

Link your system to Deltek Costpoint or a similar platform! Not only does this eliminate the time spent shuttling information through Outlook, it also ensures that information is not entered incorrectly – or not entered at all. Additionally, project leads can be maintained in Costpoint and linked to users in the approval system, automatically assigning the right approver(s) to any given approval process. Problems #2 and #3 solved.

Step 3

Create a workflow that allows for rework and rejection, and that handles the issues and items that may need to be addressed when something is “wrong.” That way, the stakeholders who need to be involved can be included automatically based on their roles, or by selecting a user from a list of the issues/departments involved. This cuts down the number of emails sent out for approvals. Assigning tasks and automating reminders in the system accomplishes all of these things, as the sketch below illustrates.
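
For readers who think in code, here is a minimal sketch of the routing logic Steps 1 through 3 describe. It is not Integrify’s or Costpoint’s actual API – every name below is made up for illustration – but it shows approvers assigned from ERP-maintained project leads, alerts that fire only when a task is past due, and a rework path for rejections:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical project-lead table, standing in for data maintained in the ERP
# (e.g., Deltek Costpoint). In a real integration this would be a query.
PROJECT_LEADS = {"ACME-001": "janet", "BETA-002": "amy"}

@dataclass
class BillApproval:
    project_id: str
    amount: float
    status: str = "pending"            # pending -> approved | rework
    approver: str = field(init=False)
    due: datetime = field(init=False)

    def __post_init__(self):
        # Step 2: assign the approver from the ERP's project leads, so
        # reassignments (Amy to Janet) happen without anyone remembering.
        self.approver = PROJECT_LEADS[self.project_id]
        self.due = datetime.now() + timedelta(days=2)

    def todo_alert(self) -> str | None:
        # Step 1: alert ONLY when an approval is pending and past due,
        # instead of adding to a 9,999-message inbox.
        if self.status == "pending" and datetime.now() > self.due:
            return f"Reminder to {self.approver}: bill for {self.project_id} is past due"
        return None

    def decide(self, approved: bool) -> str:
        # Step 3: a rejection routes to rework instead of dying in email;
        # an approval would trigger posting back to the ERP automatically.
        self.status = "approved" if approved else "rework"
        return self.status

bill = BillApproval("ACME-001", 12500.00)
print(bill.approver)        # janet - assigned from project leads, not guesswork
print(bill.decide(False))   # rework - routed back instead of lost in an inbox
```

The point is that the approver comes from the system of record, reminders fire themselves, and a rejection has somewhere to go besides somebody’s inbox.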

Step 4

Solve Problem #4.  Remove yourself from your enemy list.  Relax. Stay out of email. Work on other things. Seriously. At a recent conference I attended, it was estimated that we spend around 28 percent of our work time sending or reading emails. What happens when you remove a single work stream worth of emails from your list of things to do? You can get back a piece of that time to work on other more pressing issues.

If it sounds like I’ve been through this process at least a few times, it’s because I have. Using the power of a business process management tool called Integrify, NeoSystems has automated this and other processes and tied those processes to Costpoint and similar platforms. Throughout this series, I will highlight the ways we have implemented, envisioned, and produced time-saving, compliance-driven processes that integrate with your ERP to create an Enhanced Workflow Automation Framework.

Have burning questions about Process Automation? Feel free to contact me ahead of next week’s blog post.


Collecting Big Data Footprints

May 23rd, 2016 | Posted by Sarah Jones in Guest Blogs | Member Blog Posts

This week on NVTC’s blog, the Virginia Commonwealth University School of Engineering shares research on Big Data footprints that the Electrical and Computer Engineering Department is working on with the Huazhong University of Science and Technology.


Xubin He, Ph.D., professor and graduate program director of the Virginia Commonwealth University School of Engineering Electrical and Computer Engineering department, is working with Huazhong University of Science and Technology (HUST) to establish an international research institute focused on creating design techniques to improve data reliability and performance. Coordination efforts are currently underway to create rotation periods for students from VCU and HUST to conduct research within each university’s state-of-the-art laboratories.

“This next step in our partnership with VCU helps both universities attract more high-quality research students, while enhancing the breadth and depth of our research,” said Dan Feng, Ph.D. and dean of the School of Computer Science and Technology at HUST. Feng also serves as director of the Data Storage and Application lab at HUST.

Managing big data

Data storage is a booming industry, with lots of opportunities. Just a decade ago, computational speed dominated research efforts and water cooler conversations. According to He, data is more important now. “Data empowers decision-making and drives business progress. No one can tolerate data loss, whether that data represents favorite photos or industry trends and analytics,” added He. And yet, trying to increase data capacity or replace obsolete data systems can shut down vital data centers for days.

Research teams from both universities find creative solutions to global data pain points. For example, these collaborative research teams reduced overhead costs associated with data failures by up to 30 percent. Their algorithms allow businesses to encode data that can be easily retrieved, instead of having to rely on costly data copies or redundant data centers.
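
The article doesn’t spell out the algorithms, but the general technique it points to – encoding data so that a lost piece can be rebuilt, rather than keeping full copies – can be seen in miniature in a single-parity scheme. The sketch below is only a conceptual illustration of that idea, not the VCU/HUST teams’ actual work; production systems typically use stronger codes such as Reed-Solomon:

```python
# Minimal XOR-parity illustration of erasure coding: store k data blocks plus
# one parity block, and recover any single lost block from the survivors,
# without keeping a full second copy of the data.

def xor_blocks(blocks: list[bytes]) -> bytes:
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

data = [b"data-blk", b"more-blk", b"last-blk"]   # k = 3 equal-size blocks
parity = xor_blocks(data)                        # one extra block, not three copies

# Simulate losing block 1, then rebuild it from the survivors plus parity.
recovered = xor_blocks([data[0], data[2], parity])
assert recovered == data[1]
```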

Currently, in addition to HUST, He’s team also works with top data storage companies such as EMC, which ranks 128th in the Fortune 500 and reported revenues of $24.4 billion in 2014.

The network effect

He has a simple philosophy to gauge the success of university research efforts — he looks at who else is there. “At top data storage and systems events such as USENIX’s File and Storage Technologies conference and USENIX’s Annual Technical conference, we’re presenting with peers from Harvard, MIT, Princeton and other premier universities we admire,” said He. These conferences typically accept about 30 presentation papers — that’s less than 20 percent of the global submissions they receive.

“Professor He’s leadership represents one of many efforts to build our international reputation in industry and academia,” said Erdem Topsakal, Ph.D. and chair of the Department of Electrical and Computer Engineering. “HUST is ranked 19th on U.S. News & World Report’s Best Global Universities for Engineering list. When leading universities like HUST want to work closely with you, you know you’re doing something right.”

For more news from the Virginia Commonwealth University School of Engineering, click here.


Protecting Data at Its Core

May 20th, 2016 | Posted by Sarah Jones in Guest Blogs | Member Blog Posts

This week on NVTC’s blog, Richard Detore of GreenTec-USA discusses the deep concern over recent cyber-attacks and offers a solution to prevent data damage.


Everyone in the cybersecurity field – both inside and outside of government – is deeply concerned over the kind of cyber-attacks that hit federal agencies such as the Office of Personnel Management (OPM) and private companies such as Sony. Rightly so, government agencies and private companies continue to make large investments in cybersecurity.

This sense of urgency extends to America’s key infrastructure, as underscored last October when President Obama issued a Presidential Proclamation on Critical Infrastructure and Resilience. In that proclamation, the president noted that

“Our Nation’s critical infrastructure is central to our security and essential to our economy. Technology, energy and information systems play a pivotal role in our lives today, and people continue to rely on the physical structures that surround us. From roadways and tunnels, to power grids and energy systems, to cybersecurity networks and other digital landscapes, it is crucial that we stay prepared to confront any threats to America’s infrastructure.”

Last year, in testimony before the Senate Armed Services Committee, Director of National Intelligence James Clapper noted how cyber-attacks threaten public and private sector interests:

“Most of the public discussion regarding cyber threats has focused on the confidentiality and availability of information; cyber espionage undermines confidentiality, whereas denial-of-service operations and data-deletion attacks undermine availability. In the future, however, we might also see more cyber operations that will change or manipulate electronic information in order to compromise its integrity…instead of deleting it or disrupting access to it. Decision making by senior government officials (civilian and military), corporate executives, investors, or others will be impaired if they cannot trust the information they are receiving.”

And in his most recent appearance before the Senate Armed Services Committee, Clapper stated that “Cyber threats to U.S. national and economic security are increasing in frequency, scale, sophistication and severity of impact.”

According to a recent study published by the cybersecurity firm Tripwire, 82 percent of the oil and gas companies surveyed said they saw an increase in successful cyberattacks over the past year. More than half of the same respondents said the number of cyberattacks increased by 50 to 100 percent over the past month.

Last year, federal investigators uncovered the fact that Russian hackers had penetrated the U.S. State Department in a major cybersecurity breach – one that also gave them access to the White House, including the President’s schedule.

Other threats, such as ransomware, are now on the radar screen of key policy makers in Congress, as well as the U.S. Departments of Justice and Homeland Security. Ransomware encrypts a computer user’s information, and hackers then demand payment – usually in the form of crypto-currency such as Bitcoin (which is extremely difficult to trace) – to unlock the information.

In fact, in recent years several police departments have fallen victim to ransomware and have had to make payments to the hackers. One typical example happened in Maine, where two police departments were hacked. To date, the perpetrators in these cases have not been apprehended.

Protecting and securing data at its core is obviously a key component of cybersecurity efforts for both the public and private sectors. Yet while it is important for those efforts to focus on improving detection and enhancing firewalls, protecting data at its core is an approach that is often overlooked.

Until recently, it was not possible to fully protect data at its core – the hard drive. In 2013, Write-Once-Read-Many (WORM) disk technology was developed and successfully installed that now, for the first time, allows government agencies and private companies to safely secure and protect data at the physical level of the disk. Any and all data stored on a WORM disk cannot be altered, overwritten, reformatted, deleted or compromised in any way within a computer or data center. The WORM disk functions as a normal hard disk drive, with zero performance degradation from its additional built-in capabilities. These capabilities prevent data damage from any form of cyberattack.
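
As a rough mental model of that behavior – GreenTec’s enforcement lives in the drive hardware itself, so this toy software sketch only illustrates the write-once/read-many contract, not the product – consider:

```python
# Toy model of Write-Once-Read-Many (WORM) semantics: once a block is written,
# any attempt to alter, overwrite or delete it is refused, while reads remain
# unrestricted. Illustrative only; real WORM disks enforce this physically.

class WormStore:
    def __init__(self) -> None:
        self._blocks: dict[int, bytes] = {}

    def write(self, block_id: int, payload: bytes) -> None:
        if block_id in self._blocks:
            raise PermissionError(f"block {block_id} is write-once; overwrite refused")
        self._blocks[block_id] = payload

    def read(self, block_id: int) -> bytes:
        return self._blocks[block_id]    # "read many": reads are unrestricted

store = WormStore()
store.write(0, b"audit record #1")
print(store.read(0))                     # b'audit record #1'
store.write(0, b"tampered!")             # raises PermissionError
```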

This new breakthrough combined with encryption makes it impossible for hackers to steal data or render it useless by attacking the stored data, or disks.

In addition to anti-malware advances and firewall enhancements, comprehensive cybersecurity efforts should take a close look at technologies that protect data at its core. Such efforts will impact the public and private sectors in profound ways.

Richard Detore is an NVTC member and CEO of GreenTec-USA, a technology company based in Reston, VA.


LeaseWeb introduces six tips to help you find the perfect cloud partner for your business.


By now, the concept of the cloud is ubiquitous, but for many business leaders the idea still presents more challenges than opportunities. Understanding the complicated technology, not to mention the vast array of delivery models, degrees of services and levels of security available, can be a daunting task for companies under pressure to adapt or adopt.

In a new white paper, “Developing a Cloud Sourcing Strategy: Six Steps to Select the Right Cloud Partner,” LeaseWeb gives decision makers the tools they need to formulate an effective cloud strategy and to identify the right cloud partner to execute it. In summary form, these six tips will help you find the right cloud partner for your business.

  1. Support and services — For most businesses, concerns about cost, security, vendor management and technology take the lead in the search for a reliable cloud partner. Surprisingly, the ability of  a provider to smoothly and effectively deliver customer support, SLAs and managed services is often minimized or overlooked, at the expense of the customer. When deciding which cloud partner best fits your needs, don’t underestimate the crucial importance of the support and services they make available. It’s the difference between a cloud partnership that takes your business to new levels and one that just adds to your daily hassles.
  2. Architectural alignment — One of the biggest considerations is whether to use a hyper-scale or traditional hosting model. Practically speaking, a hyper-scale provider requires users to be responsible for operational, day-to-day tasks, while hosting providers oversee the day-to-day management of the infrastructure elements. It’s up to you to decide which is a better fit for your technical team and business needs.
  3. Security and compliance — Data centers are a frequent target of malicious attacks, so it’s important to make sure that your cloud provider is prepared for every eventuality. This means everything from physical security and network threat recognition to regular security audits and up-to-date compliance certifications such as HIPAA. Your data is your most valuable asset, so make sure it’s going to be treated that way.
  4. Support for data sovereignty and residency requirements — In tandem with security and compliance issues, data residency is another issue that frequently stalls cloud and hosting projects. The growth of “bring your own device” (BYOD), big data and cloud projects is dragging sensitive data to third-party clouds and data centers. This makes many business owners uneasy, which is why it’s so important to address the location of your data, the laws governing the export of data wherever it’s stored and the security and encryption of that data.
  5. Financial management — Traditional hosting companies typically offer a more basic cost scheme, based upon initial configurations with monthly utilization. This traditional model works well for companies with steady and predictable usage patterns. Hyper-scale cloud services, on the other hand, were built around granular per minute or hourly costs from their inception. Provisioning is primarily self-service and allows users to turn up server, storage and network services. This feature appeals to users who need to spin up environments in near real time and then turn them down when not needed. Consider your requirements to determine which model fits you – or if you want a mix of both.
  6. Cultural and strategic alignment — Cultural fit with your service provider is a key point that never receives enough attention in the RFP process. For nearly all enterprises, using a cloud or hosting provider is truly a new venture, one that requires extensive internal buy-in. For first-time cloud buyers, the ongoing degree of partnership is an unknown factor. Each provider engages and on-boards clients differently.

If you’re in the process of picking a cloud partner for your business, remember that no one becomes a cloud infrastructure expert overnight. But with a smart approach, you can make an informed decision that will lead to great results for your company.

Ultimately, remember that you will only achieve the higher-performance and lower-cost environments you are aiming for by choosing the provider that fits your needs and requirements best.

To learn more, visit us here to receive our full white paper on selecting the right cloud partner today.


May is Mental Health Awareness Month. Innovation Health Chief Medical Officer Sunil Budhrani urges Virginians to start a conversation about mental health. Budhrani moderated the Digital Health panel at NVTC’s Health Care Informatics & Analytics Conference on May 5.


Even for a medical expert, mental health can be a difficult topic to talk about.

I know the terminology, proper treatment plans and resources. But as a society (even among health providers), we often don’t know how to talk to those in need of mental health support – sometimes including ourselves. It’s uncomfortable. It’s emotional. It’s personal. So we don’t share. Don’t ask. Don’t act. And suicide rates across our nation skyrocket.

We need to talk about mental health.

When I joined Innovation Health as Chief Medical Officer last month, I sat down with my team and we made a collective decision. We decided to speak from our own personal experiences with mental health, however imperfectly. Because talking about mental health is the best way to truly help remove the stigma associated with mental health conditions.

Working as an ER doctor, I frequently saw patients whose anxiety and depression had gone unmanaged and ultimately led them to attempt suicide. Some I was able to help. For others there was nothing I could do. I realized that many times these patients weren’t getting the help they needed because they feared being labeled or misunderstood. Time and again, I saw that the cost of not treating these symptoms could be fatal.

Now, after so many years, so many news reports, and seeing so many of my colleagues and friends struggle, it is clear to me that we must confront the topic of mental health head-on if we are truly going to make a difference.

May is Mental Health Awareness Month and I hope it will be a catalyst for this critical conversation, which impacts so many Americans.

The proof is in the numbers: according to the National Institute of Mental Health, nearly one in four adults and one in five children in the U.S. have a diagnosable mental health condition. In Virginia, more than 230,000 adults – roughly 3.8 percent of the population – have experienced a serious mental illness. These facts tell me one thing: we are not alone. We all know someone, work with someone, or love someone who struggles with mental illness. We may struggle with it ourselves. The fact is that anxiety, depression and substance abuse touch every community. The time to accept this is now. The time to speak up and reach out is now.

Many people don’t get the services they need because they don’t know where to start. If you or someone you know is struggling, you can start the healing process by following these three steps:

  1. Talk to a primary care physician about your mental health. They can help connect you with the right mental health support. If you do not have a PCP, I highly recommend you select one for your general health care needs.
  2. Educate yourself. Visit the Innovation Health website to take a depression or anxiety assessment or call 703-289-7560 to schedule an in-person assessment with a trained counselor.
  3. Be proactive about mental well-being. If you know someone who may be experiencing symptoms related to a mental health condition, encourage them to get the help they need.

It is never easy or comfortable to approach situations like this, but as a community we can’t let our fear or doubts stop us from helping others – or ourselves – deal with mental illness. Talk about mental health with your family, friends and colleagues not just this month, but all year.

Together we can work to build a healthier world. But first, we have to start the conversation.


Kristin D’Amore of Dovel Technologies provides a look into how Virginia is supporting student innovation, an essential asset to the Commonwealth’s economy.


New businesses account for nearly all net new job creation and almost 20 percent of gross job creation, and they are responsible for a disproportionate share of innovative activity in the United States.* There is an enormous amount of entrepreneurial activity occurring at institutions of higher learning throughout the country, and Virginia is taking strides to strengthen student innovation on its campuses. On April 14, 2016, Governor Terry McAuliffe signed into law legislation that directs the Boards of Visitors of public colleges and universities to adopt intellectual property (IP) policies that are supportive of student entrepreneurship. The legislation, which was sponsored by Del. Charniele Herring, was supported by NVTC and a broad coalition of higher education and business community organizations across Virginia.

The legislation reduces some barriers to entry for student entrepreneurs by clarifying existing university IP policies to specify the conditions under which institutions of higher education, as opposed to students, own intellectual property. Current policies at some institutions create uncertainty about IP ownership, which discourages students from launching new ventures, starting businesses, or commercializing research based on their own ideas. The bill encourages a campus culture that supports entrepreneurship and motivates Virginia’s universities to be hubs of creativity and innovation with the potential to drive regional economic growth through research commercialization and new business formation.

The issue of student entrepreneurship and IP rights was raised by the Governor’s Council on Youth Entrepreneurship, which was formed in August 2015 to study and recommend ways to support young business owners and innovators in the Commonwealth. The group comprises leaders in higher education and business, along with innovators and entrepreneurs. As a member of the Council, I was pleased to see an early win for young entrepreneurs and students across Virginia.

Increasing student innovation and promoting IP commercialization and new patents by students are critical to growing Virginia’s economy. Statistics from the Council on Virginia’s Future show that although Virginia’s rate of patent formation has improved in recent years, it is still well below the U.S. average. Furthermore, Virginia universities generated 1.94 startups per one million residents in 2013, measurably below the national rate of 2.38 startups and ranking the Commonwealth 27th in the country.

The Council on Youth Entrepreneurship is continuing its efforts to assess resources and opportunities in Virginia for young entrepreneurs and will present additional recommendations to the Governor later this year. These recommendations will cover areas including financial incentives for business formation, improving regulatory processes for entrepreneurs, strengthening academic programs for student innovators in K-12 and higher education, and marketing the assets of Virginia’s education system to students, faculty and business leaders across the country. The Council’s efforts are focused on providing the next generation of entrepreneurs and innovators a solid foundation from which to launch their ideas, ultimately leading to further growth in the economy.

* According to the Kauffman Foundation, the largest foundation in the world devoted to entrepreneurship.

Kristin D’Amore is Director, Market Development and Strategy at Dovel Technologies and a member of Governor McAuliffe’s Council on Youth Entrepreneurship. 


This week on NVTC’s blog, George Lavallee of NeoSystems offers five challenges and solutions to help customers get the most out of CER.


Deltek Costpoint Enterprise Reporting (CER) provides customized, data-driven business intelligence (BI). Fueled by the IBM Cognos 10 analytics engine, CER enables organizations to mine their Costpoint data to provide consistent snapshots of performance, view historical trends and predict results.

CER is the king of BI. Even with its state-of-the-art capabilities, however, there are potential roadblocks – employee turnover, user experience levels, changing data and accounting needs, and other systems within the enterprise – that can keep companies from fully reaping its benefits. Unresolved, these challenges can cause inaccuracies, consume months of time and erode leaders’ confidence.

1.   Powerful Customization Capabilities

While running CER reports is simple, developing them can be challenging for users unfamiliar with Costpoint’s underlying data structure. To solve this problem, Deltek created standard reports (CER Reports) covering everything from project management to payroll to procurement. These prebuilt templates enable users to quickly generate reports that capture the most commonly used fields across a wide swath of businesses. However, they may not include user-defined fields or the specific metrics you have implemented for your own needs.

To capture these data, enterprises will want to build custom reports or modify existing standard reports. Creating or modifying Cognos reports requires a strong working knowledge of both the Cognos tool and the structure of the Costpoint database. For example, you can’t simply click a button to tailor a report to your accounts payable process or labor management structure. You will need to understand where the pertinent data elements reside and how to access them using the Cognos toolset. Many intermediate users lack the skills to effectively craft custom reports.

2. Complexity of Government Contracting Accounting Data

Costpoint uses more than 1,800 inter-related data tables that capture a wealth of information about your company. Access to this complex store of data can be of great benefit to your organization, but creating a report that captures the data relevant to your needs challenges many organizations. Many users are unsure about which data to query and how to convey it on a well-designed report. Common questions include:

  • What data tables do I access?
  • How do I arrange the data?
  • Which charts do I use?
  • What rendering options are best?

Additionally, most organizations have budgets and forecasts and want to integrate this data with actual results within their reports. You may have used another system to create your budgets, such as TM1, Adaptive or even Excel. Accordingly, integrating this data into reports generally means pulling data from systems outside Costpoint, further compounding complexity.
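
As a concrete, purely illustrative example of that outside-Costpoint integration, the sketch below joins actuals pulled from a Costpoint-style table with an Excel budget using Python and pandas. The connection string, table, column and file names (PROJ_SUM, PROJ_ID, ACTUAL_COST, fy_budget.xlsx) are hypothetical stand-ins that would need to be confirmed against your own schema:

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection; point this at your own Costpoint database.
engine = create_engine("mssql+pyodbc://user:password@costpoint_dsn")

# Pull actuals from a (hypothetical) project summary table...
actuals = pd.read_sql("SELECT PROJ_ID, ACTUAL_COST FROM PROJ_SUM", engine)

# ...and the budget from wherever it lives outside Costpoint (Excel here,
# but it could just as well be an export from TM1 or Adaptive).
budget = pd.read_excel("fy_budget.xlsx")        # columns: PROJ_ID, BUDGET_COST

# Merge and compute variance for the report.
report = actuals.merge(budget, on="PROJ_ID", how="left")
report["VARIANCE"] = report["BUDGET_COST"] - report["ACTUAL_COST"]
print(report.sort_values("VARIANCE").head(10))  # ten largest overruns first
```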

3. Robust Security

Increasingly in today’s world, data breaches are affecting companies in all sectors. Breaches tarnish a company’s reputation, expose data and jeopardize audit compliance. Fortunately, Cognos and CER deliver robust and highly configurable security controls. The hundreds of available settings, however, can stymie many organizations.

Novice and even intermediate users may not realize that out-of-the-box Cognos installations may not incorporate security settings that are optimal for their company’s situation. Organizations could inadvertently expose proprietary data and confidential employee information. For instance, you might assume that granting access to the projects package would only enable users to see project-related data, but close examination reveals that confidential employee information may be included if that data is not properly secured by appropriate user-specific security profiles. The default security settings may not provide the degree of security required in rapidly changing environments.
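
Cognos security is configured declaratively in the tool itself, but the underlying idea – row- and column-level filtering tied to a user’s security profile – can be sketched generically. All of the profile names and fields below are hypothetical, for illustration only:

```python
# Generic illustration of profile-based security: the same "projects" query
# returns different rows, and hides confidential columns, depending on the
# requesting user's profile. Not Cognos's API; the names are invented.

PROFILES = {
    "pm_alice": {"projects": {"ACME-001"}, "see_salary": False},
    "hr_admin": {"projects": {"ACME-001", "BETA-002"}, "see_salary": True},
}

ROWS = [
    {"project": "ACME-001", "employee": "F. Miller", "hours": 32, "salary": 98000},
    {"project": "BETA-002", "employee": "J. Smith",  "hours": 40, "salary": 87000},
]

def secured_query(user: str):
    profile = PROFILES[user]
    for row in ROWS:
        if row["project"] not in profile["projects"]:
            continue                  # row-level filter: project access
        visible = dict(row)
        if not profile["see_salary"]:
            visible.pop("salary")     # column-level filter: confidential data
        yield visible

print(list(secured_query("pm_alice")))  # one project, salary hidden
print(list(secured_query("hr_admin")))  # both projects, salary visible
```

Without the column-level rule, granting project access would quietly expose the salary field – exactly the kind of leak described above.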

4. Analytics Development and Optimization

Unless you have a dedicated analytics staff with the required expertise, individuals developing your reports may not be effective. Why? Developing reports and maximizing efficiency requires experience with both Costpoint and Cognos.

Often, employees tasked with reporting business intelligence have other duties. CER management is a part-time responsibility. They may have deep functional knowledge, but minimal understanding of Costpoint data structures and Cognos query and reporting requirements.

If you are caught in this situation, the results can be challenging. Inexperienced users take 10 to 20 times longer than an expert to develop a report. Reports can be late, compressing the time available for meaningful analysis and diverting time away from other business-critical duties that may better align with the employee’s hired skill set.

5. Meaningful Results

Inexperience can also cause inaccuracy in reported results. Novice users may query the wrong data or omit data that would dramatically improve a report’s usefulness. Such mistakes could put contracts at risk or result in poor or ill-informed business decisions. You want your reports to carry the most meaningful data possible.

Imagine you’ve tasked your logistics manager with developing an incurred cost submission report. That individual skillfully maintains your supply chain. But he or she doesn’t use Costpoint every day and may not understand which direct and indirect cost tables to query. Your report might miss critical costs or include unallowable items.

Errors like these erode leader confidence. Just one inaccurate report, and senior managers may mistrust all your data outputs. You’ve damaged your reputation and possibly jeopardized your contracts simply because you didn’t fully understand the data structures and how to best capture the data that conveys the most meaning.

Signs You Need Help

How do you know if these challenges are hindering your data analytics? Talk to your users and business managers. If you hear the following, you are probably underutilizing the power of CER or you might need a CER tune-up.

  1. Reports are consistently late.
  2. More time is spent collecting data than analyzing it.
  3. Executives don’t trust data accuracy.
  4. Significant manipulation is required to analyze data.
  5. Reports have unusually long run times.
  6. Project managers lack administrative visibility (i.e., they can’t see unpaid invoices, approaching funding ceilings, missing bill rates, etc.).

Consequences of Inaction

Inexperienced users waste time collecting the wrong data and not enough time analyzing results. Inaccuracies cause executives to doubt the validity of all your reports. Poor security can expose proprietary data and compromise audit results. Depending upon the number of users, solving these challenges could save your company months of labor and lead to better, more timely business decisions.

If you owned a Tesla, you would make sure you were trained and educated in how to fully utilize it. At a minimum, you would take it to experts to make sure it was operating efficiently and effectively.

What You Need to Know:

To fully benefit from Deltek CER, companies should routinely assess their CER configurations, processes and output. A CER tune-up is easy and should be a standard operating procedure within all Deltek organizations. Maximizing the power of CER will allow companies to reap substantial benefits, overcome any “obstacles” and enable their organizations to succeed.


Who Owns the Data?

March 22nd, 2016 | Posted by Sarah Jones in Guest Blogs | Member Blog Posts

This week on NVTC’s blog, Svetlana Sicular of the Gartner Blog Network explores the question of who owns the data – a question that, with IoT and cloud, will soon matter even to those who don’t care now.


The greatest meaning of “big” in Big Data is the role of data in the digital economy. The question of who owns the data is big too. With IoT and cloud, data ownership will soon matter even to those who don’t care now.

And there is no universal answer — data ownership is culture-specific. In some cases, nobody wants to own the data; in other cases, everybody wants to grab a piece (“it’s mine!” although the “owner” didn’t even know that data existed before you asked). With external parties participating, things are even more complicated: for example, one party might learn that it does not have rights to the data it considered its own. To solve ownership — but not alleviate the problem! — some organizations decide that data belongs to their customers, citizens or third parties, and that the company is only a custodian.

What successful approaches to data ownership have I seen?

The universal first step is establishing an institute of data governance.  I just published a research note on how to do this: EIM 1.0: Setting Up Enterprise Information Management and Governance. You don’t have to call it “data governance.” It could be “data advocacy” or simply a name reflecting the nature of taking care of data. It should resonate with a specific organizational or ecosystem culture.

The next steps would be specific to the culture and the nature of the business: figuring out what data is most vital. This will narrow down data ownership to the decisions that matter (which will save a lot of grief and lots of hours).

The versions of data ownership I have seen:

  • An information governance mechanism resolves it through top-down decision making.
  • Subject matter experts step forward to own the data on which they are SMEs (bottom up).
  • Application business owners are offered ownership of the data, accept it and take it (unexpectedly) seriously, which is fruitful for everyone.
  • Data operators become de facto data owners (which could be a solution, but could also be a greater problem). Transparency about what is being done with the data, plus explicit data access rules, makes it a solution.
  • When data ownership is hard to resolve at a high level, going more granular and resolving ownership of individual data elements (which is usually more obvious) answers the question.
  • A business executive assumes data ownership. The worst case is when such ownership belongs to an executive who has control but no understanding of the data — e.g., the executive owns the data but does nothing, because executives are busy doing other things. The best case is when this executive is a sponsor of data-related work.

Ownership is just one part of taking care of the data. Look at the root of the issue: who can do what with which data without stepping on each other’s toes, how to avoid trouble with regulations, and how to ensure you put data to work ethically. Data governance often starts with compliance and ownership, but — unavoidably — it ends up finding value in the data, which is big in the digital economy.

Follow Svetlana on Twitter @Sve_Sic. For more on Big Data, check out NVTC’s Big Data and Analytics Committee.  The NVTC committee is hosting its first conference on May 5 which will explore health care informatics and analytics.


Overcoming Obstacles in Entrepreneurship

March 14th, 2016 | Posted by Sarah Jones in Guest Blogs | Member Blog Posts

This week on NVTC’s blog: on Feb. 18, the NVTC Small Business and Entrepreneur Committee sponsored a fireside chat entitled “Journey to Success: Overcoming Obstacles in Entrepreneurship.” Nate Miller, an assurance intern at Aronson LLC, shares the top tips from that event, which featured a fireside chat with Gary Shapiro, president and CEO of the Consumer Technology Association, and Scott Case, the founding CTO of Priceline.com and founding CEO of Startup America. 


On February 18, the NVTC Small Business and Entrepreneur Committee sponsored a fireside chat entitled “Journey to Success: Overcoming Obstacles in Entrepreneurship.” The event was moderated by Gary Shapiro, President and CEO of the Consumer Technology Association, who interviewed Scott Case, the founding CTO of Priceline.com and founding CEO of Startup America.

The discussion focused on Case’s experiences throughout his lengthy career as a key member of several startup companies; Case also shared his thoughts on the potential pitfalls startup companies can fall victim to. Some of Case’s key points to attendees included:

  • The time commitment required to be a successful entrepreneur.
  • The importance of creating and expanding connections in a growing area such as the D.C. region.
  • The need to overcome failure repeatedly on the road to success.
  • The importance of maintaining a valuable network with the appropriate resources.

Case’s entrepreneurial journey started with part-time jobs as an assistant for a plumbing company and a well driller, and with running a local lawn-mowing business, where he gained an understanding of mastering his trade, servicing customers, and constantly looking for the next opportunity. While attending the University of Connecticut, Case spent his free time developing an advanced flight simulator with fellow classmates, only to discover that the startup could not generate sales due to a lack of marketing. As a result, a key lesson that Case carries with him to this day is the need to supplement a great product with a sales and marketing team and other supporting functions.

Undeterred, Case shunned more secure employment opportunities to continue working with startups and, following an introduction to Priceline.com founder Jay Walker a few years later, joined Priceline as the founding CTO. During his tenure, Case’s team at Priceline developed a “name your own price” system in the early days of the internet that allowed the company to grow significantly and successfully undergo an IPO with an initial market capitalization of more than $12 billion. Case attributes his success at Priceline to his understanding of the available technology, his ability to effectively market it, and the creation of a team willing to try a number of ventures without fear of failure.

Since leaving Priceline in 2000, Case has co-founded or led several other ventures focused on technology, entrepreneurship and philanthropy, such as Main Street Genome, Startup America Partnership, Malaria No More and, most recently, Potomac Innovation, a new business travel purchasing company. Case stressed the need for entrepreneurs to remain connected to advisers, financiers, peers and customers if they are to be successful, and noted that incubators, such as 1776 in D.C. and others in startup hubs such as New York City and Silicon Valley, can greatly help a new entrepreneur. Case also stressed the importance of minimizing government regulation to allow new business owners to focus on their core activities, but noted that government can help by listening and responding to entrepreneurs’ needs – such as the recent decision by the city of Nashville, Tennessee to significantly increase its broadband access, which appears to be attracting entrepreneurs to the city.


3 Reasons Why M&A Will Continue to Thrive in 2015

February 17th, 2015 | Posted by Sarah Jones in Guest Blogs | Member Blog Posts

This week on NVTC’s blog, guest blogger Gretchen Guandolo of member company Clearsight Advisors discusses the success of M&A in 2014 with the return of gargantuan deals, largely seller-friendly transaction structures and premium valuations, and offers three reasons why 2015 will be just as successful.


In what was widely considered a banner year for M&A, 2014 saw the return of gargantuan deals, largely seller-friendly transaction structures and premium valuations. In spite of turbulent equity markets driven by fluctuating oil prices, a gathering storm in Europe, and uncertainty around rising interest rates, we at Clearsight are already seeing the makings of a very big M&A year. Globally, investment banks are seeing increased deal flow and expanding pipelines. Our team is already out to market with several deals that are garnering high demand and premium valuations from a number of unique buyer groups. We expect the rising M&A tide to continue through 2015, as we believe demand for niche leadership positioning, strong growth trajectories, and seasoned management teams is unlikely to dissipate. First, a few fun facts from 2014 that will continue the momentum through 2015:

  • In 2014 there was $3.5 trillion worth of global M&A activity, which is up 47 percent from the year before
  • Global private equity investments totaled $561.9 billion. That’s the highest figure since 2007, and a 43 percent bump over 2013 – with 60 percent of 2014 buyout activity focused on add-on investments
  • Venture capitalists disbursed a massive $87.8 billion (compared to $50.3 billion for 2013) via 7,731 deals
  • Companies raised around $249 billion in global IPOs in 2014, which was the busiest year for new listings since 2010

So what do we expect for this year?

  • There is likely to be a frenzy of activity in certain verticals, including healthcare, energy and technology. Technology continues apace with no sign of slowdown, and while the energy sector is harder to predict, one thing is clear – disruption in a regulated industry makes for a great M&A environment
  • Investor interest in certain technologies is likely to grow. Some of our favorites include customer experience, big data, and human capital management. Technologies that enable us to get into the minds of customers and lead them on a journey to experience and buy a product have become the goal of retailers, financial services companies and even government! We see the big data market continuing to evolve and mature. This year will be a great growth year for data analytics consulting businesses that leverage Hadoop and other open source technologies to deliver predictive behavior, lower costs and drive increased revenue. Human capital technologies will continue to surge as employers seek out the best talent and retain and train individuals in a hyper-competitive market.
  • As seen in 2014, both private equity and strategic acquirers will drive robust market competition. Nearly all of our processes include both strategic and financial buyers, and as private equity grows increasingly aggressive on pricing in an effort to put money to work, we see strategic buyers dominating 2015.

Growth will continue to be the main driver of valuations throughout 2015. Premium multiples will go to the companies with a demonstrated high-growth track record and a robust pipeline for future growth. Growth will eclipse profitability through 2015.