We continue to share content from our inaugural Capital Data Summit, which took place on February 15, 2017 at The Ritz-Carlton, Tysons Corner. Scroll down to view the full video of DigitalGlobe Dr. Walter Scott's Capital Data Summit keynote presentation, along with his presentation slides.

The 2017 Capital Data Summit was highlighted by its keynote speakers: Clarabridge Founder and Vice Chairman Sid Banerjee, Hewlett Packard Enterprise President and CEO Meg Whitman and DigitalGlobe Founder, Chief Technical Officer and Executive Vice President Dr. Walter Scott.

In his keynote, Dr. Scott discussed how satellite imagery and remote sensing are harnessing big data at a global scale. He explained trends that are enabling large scale analytics and shared how DigitalGlobe’s imagery and analytics solutions are being put into practice to serve a variety of customers and social causes. View Dr. Scott’s presentation slides here.

Dr Scott Clip Slides

View the full video of Dr. Scott’s remarks:

Check out the Capital Data Summit photo gallery!

Stay tuned! We’ll be sharing more content from the Capital Data Summit here on the NVTC blog over the coming weeks!

Our newest guest blog is written by C. Michael Ferraro, president and CEO of Training Solutions, Inc. (TSI). TSI provides diverse performance development training programs, human resources services, executive coaching, workforce consulting and facilitation of retreats for a variety of companies across the nation and worldwide. In his blog, Ferraro shares five behaviors for building thriving intact teams.


In most companies, there are ongoing challenges around increasing productivity and maintaining good employee relationships, and successful teamwork is always part of the puzzle. Many companies are uncomfortable looking at team problems. They are aware of the challenges they face with teams trying to work better together, but they don't know how to "fix" the problems because they don't know what is causing them.

Intact teams need to thrive. Whether individuals believe it or not, the goals of the team should take priority over individual goals for the team to truly succeed. A team should look "inside" itself and be honest about its challenges. According to Patrick Lencioni, there are five key behaviors that every intact team should understand. Lencioni has worked with thousands of senior executives in organizations ranging from Fortune 500 and mid-size companies to start-ups and nonprofits. He is the author of many best-selling business books, including The Five Dysfunctions of a Team, which has sold over two million copies and continues to be a fixture on national best-seller lists week after week.

His newest program is called The Five Behaviors of a Cohesive Team (based on his best-selling book, The Five Dysfunctions of a Team). Based on Lencioni’s model, the five key behavioral areas are Trust, Conflict, Commitment, Accountability and Results. Every individual on the team completes an online assessment and then the team report is developed. The assessment-based approach is extremely powerful. I am an authorized partner of this program and work with many teams to improve their working relationships and results.

These five behaviors are critical to the success of a team and need to be discussed honestly and in depth with the team. I walk through the assessment results with all the team members together and help them work through their challenges as they learn to develop into a stronger, more successful team.

Building Trust. This is a fundamental and foundational behavior of a cohesive team. Lencioni believes that trust means (1) "a willingness to be completely vulnerable with one another" and (2) "confidence among team members that their peers' intentions are good and that there is no reason to be protective or careful around the team." Can you say that you know many trusting teams? Teams need to begin with a discussion of trust so they can be open and honest about their issues. Without high trust, the other behaviors cannot be improved.

Mastering Conflict. Productive conflict is needed for relationships to grow and last over time. Conflict is necessary and shouldn't be considered "off limits" in the workplace. Healthy, productive conflict is not focused on individuals but connected to subjects and ideas. Productive conflict should not be avoided. Great teams need thoughtful debates to produce great solutions. When I facilitate this program, we also talk about unhealthy behaviors around conflict and give the team a team map they can use to help them use conflict constructively.

Achieving Commitment. When talking about commitment, Lencioni focuses on the team's decisions and making sure there is complete buy-in from every member of the team. Of course, some members may disagree at different points in the discussion, but in the end, there should be complete agreement to move forward with the decision. Great teams understand the importance of commitment. Without total commitment, there will be ambiguity, missed opportunities, low confidence and more.

Embracing Accountability. This means, per Lencioni, the “willingness of team members to call their peers on performance or behaviors that might hurt the team.” This is very important because team members must feel comfortable sharing their discomfort with another team member about their performance or behavior. This feedback is given from the perspective of wanting to help the team succeed and should be appreciated.

Focusing on Results. Of course, the point of working through the four previous behaviors is to produce better results. You'd be surprised how often a team isn't focused on team results. Perhaps team members are focused on their own individual status, position or results. The collective success of the team is what matters most.

This assessment-based program gives a team a great opportunity to learn more about each of its members and how to work better together to become a truly great team. The results we have had with teams have been dramatic, showing better team harmony and increased productivity. We also offer to come back to the team after a period of months and reassess the team in these areas.

To learn more about improving intact teams, click here.

Check Out the Capital Data Summit Photo Gallery!

February 22nd, 2017 | Posted by Alexa Magdalenski in Capital Data Summit

One week ago, NVTC hosted the inaugural Capital Data Summit at The Ritz-Carlton, Tysons Corner. With over 300 attendees, the Summit highlighted our region's unmatched set of data analytics assets, featuring keynote remarks by Clarabridge Founder and Vice Chairman Sid Banerjee, Hewlett Packard Enterprise President and CEO Meg Whitman and DigitalGlobe Founder, Chief Technical Officer and Executive Vice President Dr. Walter Scott; panel sessions led by data analytics experts from the public, private and academic sectors; and a top technology showcase.

We just launched the photo gallery from the event. Click here to view the full gallery and feel free to share and download!

Capital Data Photo 1

Capital Data Photo 2

Banerjee Keynote 1 Web Article

Whitman Fireside

Dr Scott Keynote 3 Web Article

We’ll be sharing more Capital Data Summit content and videos here on the NVTC blog. Stay tuned!

 

Our latest Capital Data Summit guest blog is from Srdjan Marinovic, CTO of DC-based startup Wireless Registry. Wireless Registry indexes the world’s detectable WiFi, Bluetooth and Bluetooth Low Energy (BLE) signals in a single, searchable platform. Marinovic will be a panelist on the Identity Management in the Age of the Internet of Things panel at the Capital Data Summit taking place on February 15, 2017 at The Ritz-Carlton, Tysons Corner.


The proliferation of Internet of Things (IoT) devices poses new security challenges. Sensitive IoT use cases, like financial transactions and access control, must be secured against novel attacker models on an unprecedented scale. Significantly, traditional identity verification and authentication methods ‒ reliant on connection and the exchange of secrets (passwords or keys) ‒ are impractical across so many billions of wireless devices. As a result, Wireless Registry envisions identity for the IoT as a "secret-less" cloud solution constructed from IoT contexts.

An IoT context is a set of wireless devices in proximity to each other. The WiFi, Bluetooth and BLE signals these things emit create temporal clusters. For example, a home context may comprise a WiFi router, Bluetooth speaker, TV and a smartphone; a transportation context a number of cars on a highway and the surrounding urban infrastructure; a payment context several routers, a point of sale and customers’ BLE wearables. Devices can leverage IoT contexts as identities that inform trust-based decision-making prior to taking an action. A driverless car, for example, may augment its assessment that it is, in fact, parked in front of an airport terminal prior to unlocking for a passenger, based on the (anonymized) “fingerprint” of detectable signals around it.

Devices move around, so proximity is constantly changing. At any given moment, wireless things enter and exit IoT contexts. Signal contexts are dynamic, both for fixed spaces and for the individual mobile devices that pass through them. A smartphone will develop a temporal identity ‒ its own IoT context ‒ based on patterns of detections of things it encounters as it moves through the world (its owner's wearables, item tracker, car, home and work WiFi contexts, and much more). Consequently, prior to enabling a mobile payment transaction, an app may rely on several different trusted IoT contexts not only to confirm that it is close to a payment terminal and inside a retail store, but also to flag abnormal usage patterns that could indicate possible theft or loss of that smartphone.
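To make the idea concrete, here is a minimal sketch of how a device might compare an observed set of nearby signals against a previously trusted context without exchanging secrets. This is not Wireless Registry's actual scheme ‒ all identifiers and thresholds below are invented for illustration:

```python
import hashlib

def anonymize(signal_id: str) -> str:
    """One-way hash so raw MAC addresses or SSIDs never leave the device."""
    return hashlib.sha256(signal_id.encode()).hexdigest()

def context_similarity(observed, known) -> float:
    """Jaccard overlap between the currently observed signal set and a
    previously trusted IoT context; 1.0 means identical surroundings."""
    a = {anonymize(s) for s in observed}
    b = {anonymize(s) for s in known}
    return len(a & b) / len(a | b) if a | b else 0.0

# A payment app might require strong overlap with a trusted "home" context
# before enabling a transaction.
home = ["wifi:router-01", "ble:speaker", "ble:watch"]
now = ["wifi:router-01", "ble:watch", "ble:unknown-tag"]
trusted = context_similarity(now, home) >= 0.5
```

A set-overlap measure fits the dynamic nature of contexts described above: devices can enter and leave a context without invalidating the identity outright, and only anonymized fingerprints need to be compared.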

In summary, whereas legacy paradigms continue to inspire security approaches that involve keys and secrets, Wireless Registry suggests that identities for the IoT may be constructed and verified through IoT contexts. Traditional solutions are adequate for smaller-scale enterprises, but they are not feasible at global scale, which is why contextual identities are the answer.


With a Ph.D. in computer security from Imperial College London and a research background in information security at ETH Zurich, Srdjan Marinovic is CTO of Washington, DC-based startup Wireless Registry.

In his guest blog, Earth Networks Chief Marketing Officer Anuj Agrawal shares an inside look into the power of environmental data and how it is making cities smarter, cleaner and more resilient. Agrawal will be speaking on the Smart Cities Panel at the 2017 Capital Data Summit taking place on February 15, 2017 at The Ritz-Carlton, Tysons Corner.


You interact with environmental data on a daily basis. It's there when you turn your smartphone on to silence your alarm. It's on your TV when you sip your morning coffee. And it's on the radio during your commute to work. Free weather data and other types of environmental information help us pick out our outfits, time our commute and plan our after-work activities. But can it do more than that?

We here at Earth Networks think so. And so does the Smart Cities Council Readiness Program. All ten of the Smart Cities Council Readiness Challenge Grant finalists named energy or some form of the environment as their top priority. This is because weather and greenhouse gas (GHG) data play vital roles in making cities more livable, workable, sustainable and resilient.

Commercial-grade weather data provides more than just a weekly forecast. In fact, its diverse capabilities make it a key component for resilient cities. Weather affects a city's population and many aspects of its economy, and weather data offers insights that no other data set can provide. Advanced weather data feeds and historical data are easily integrated into predictive models that provide cities with smart decision support, so they can plan for both routine and severe weather events.

As people move out of the suburbs and into the cities, city pollution is on the rise. The scientific community estimates that 70% of GHGs are generated in urban areas. GHG data is so important because if you can quantify the amount of emissions in your city, you can take steps toward controlling them. With GHG data, smart cities can develop smart policies to rein in those emissions and have local, accurate baseline data to support their initiatives.

Environmental data is critical for smart city development. Both weather data and GHG data offer insights that can make communities safer, healthier and, ultimately, smarter. To learn more about how weather and GHG data can both help mitigate financial, operational and human risk in smart cities developing across the world, don’t miss the Smart Cities Panel at the Capital Data Summit on February 15, 2017.

NVTC has published the second infographic in its new research series. In this data analytics-focused piece, we’ll take a look at the data behind the compelling numbers in the infographic.

NVTC Data Infographic 2017

Big Data and Analytics Drive Growth

By 2020, the amount of digital data produced will exceed 40 zettabytes, which is roughly equivalent to 6,080 years of HD video, a period longer than recorded history. More data has been created in the last two years than in the entire previous history of the human race. The total amount of data captured and stored by industry doubles every 14.4 months.

Over 3.5 billion Google searches are made daily and 205 billion emails are sent each day. Facebook has over 30 petabytes of user-generated data and Twitter sees over 230 million tweets daily; for scale, one petabyte is roughly equivalent to 13.3 years of HD video. Gartner projects that connected devices making up the Internet of Things (IoT) will grow by two billion to over 8.4 billion IoT devices in use this year, representing one billion more devices than people on the planet.
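The HD-video comparison can be sanity-checked with a quick back-of-the-envelope calculation. The implied bitrate (assuming a 365-day year) comes out to a plausible HD streaming rate:

```python
# "One petabyte is roughly equivalent to 13.3 years of HD video."
SECONDS_PER_YEAR = 365 * 24 * 3600
petabyte_bits = 1e15 * 8  # one petabyte, expressed in bits
implied_bitrate = petabyte_bits / (13.3 * SECONDS_PER_YEAR)
# implied_bitrate works out to about 19 Mbit/s, a typical HD video
# bitrate, so the comparison is internally consistent.
```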

However, less than 0.5 percent of data created is analyzed or used.

Funding & Revenue

Venture capital funding for Greater Washington companies offering analytics products and services totaled over $93 million in 2016; the largest single funding round was $12 million for ThreatQuotient, announced in August.

According to Gartner, consumers and businesses spent nearly $14 billion on devices, which is expected to grow to $17 billion in 2017 while IoT services are expected to reach $273 billion.

IDC projects double-digit growth – a CAGR of 11.7 percent – in the big data and analytics market through 2020, growing from $130.1 billion in 2016 to $203 billion in 2020. IDC also projects that the IoT market will reach $1.46 trillion with global IoT revenue of $7.06 billion by 2020, with retail and manufacturing as the two leading IoT industry verticals.
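IDC's market growth figures hang together: the compound annual growth rate implied by going from $130.1 billion in 2016 to $203 billion in 2020 matches the quoted 11.7 percent to within rounding:

```python
# CAGR implied by IDC's big data and analytics market projection.
start, end, years = 130.1, 203.0, 4  # $B, 2016 -> 2020
cagr = (end / start) ** (1 / years) - 1
# cagr is roughly 0.118, i.e. about 11.8% -- consistent with the
# 11.7% CAGR quoted in the text, allowing for rounded endpoints.
```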

Workforce

Analytics talent continues to be in high demand with no signs of slowing in the immediate future. Glassdoor named data scientist as 2016’s hottest job across all business sectors, not just IT. Data engineer was ranked third and analytics manager ranked fifth on this same list. One-third of surveyed NVTC members indicate they are currently hiring data analysts.

Conclusion

Big data and analytics is undeniably a growth industry by every possible measure and no region is better equipped to meet the demands emerging from the new data ecosystem than Greater Washington. With its leading data analytics firms, highly-skilled workforce, expertise serving federal, state and local government customers, and many outstanding academic institutions, Greater Washington’s big data assets are unmatched.

Click here to learn more about NVTC’s research initiatives or email NVTC Research and Strategic Initiatives Manager John Shaw.

Big Data: An Essential Public Asset

February 6th, 2017 | Posted by Alexa Magdalenski in Capital Data Summit | Guest Blogs

In his Capital Data Summit guest blog post, District of Columbia Chief Data Officer Barney Krucoff discusses the District’s new data investment and data accessibility plans and the importance of leveraging big data in municipalities today. Krucoff will be a speaker on the Role of the CDO panel at the 2017 Capital Data Summit on February 15, 2017 at The Ritz-Carlton, Tysons Corner.


The District of Columbia government has consistently blazed the trail to public data access. Whether real-time traffic patterns or invaluable health statistics, the data collected and managed by the District is an essential public asset. Data is just as important as our schools, roads and buildings; we would not have a functioning city without it.

Now under the leadership of Chief Technology Officer Archana Vemulapalli, the Office of the Chief Technology Officer (OCTO) understands more than ever the need to leverage the District's data investment and the significance of data accessibility. OCTO launched drafts.dc.gov, DC's version of the Madison Project, to gather comments and feedback from the public, tech and open government activists, civic groups, and government agencies.

Now, our comprehensive Open Data Policy represents a consensus of viewpoints, balancing safety, privacy and security concerns while mandating openness and transparency. The opportunity to share more data realizes Mayor Muriel Bowser’s commitment to use technology to innovate, increase transparency, and improve accountability across the government.

The final version of the Open Data Policy will modernize and augment the District's central data catalog. In turn, the public, media, entrepreneurs and academics will gain greater access to a variety of data sets. Stakeholder and resident input will be the barometer by which we measure policy success. These insights will allow OCTO to learn from successes and shortcomings while planning for a secure, yet transparent, future. Our collective knowledge and expertise makes us more effective and better positioned to become one of the most open jurisdictions in the country.

Learn more about D.C. OCTO here.

Embracing Big Data

February 2nd, 2017 | Posted by Alexa Magdalenski in Capital Data Summit | Guest Blogs | Member Blog Posts

The inaugural 2017 Capital Data Summit is less than two weeks away! Susan Burke, vice president, single family data delivery services at Freddie Mac, will be participating on the Role of the CDO panel at the Summit. In her guest blog post, Burke shares thoughts and questions leaders should consider when aligning big data activities with their organization’s business goals. 


Like many enterprises, we at Freddie Mac are in the process of determining what big data means to us. This past year, we started our journey to expand from the traditional rationalized data stores to the world of unstructured data and new technologies. One step on that path was considering a Hadoop environment. Would it bring value? What business problems would it solve?

The first lesson we learned is one that tends to be repeated in the IT-business world. In our case, we raced to develop a new technology. Why wouldn't we, when there were "obvious" value propositions we knew we could deliver? It quickly became apparent that, while our initial proof of concept provided insight into what needed to change from a back-end engineering perspective, we had not aligned with the business. Without a strong business champion, a technology-for-technology's-sake effort is doomed.

Fortunately, a strong, respected business leader stepped up to garner support and help IT define the business-use cases that would deliver value to the organization. Off we went.

Now that we’ve implemented Hadoop, what does it mean? How do we support it? Where do the data scientists fit into this picture? The technology in the data landscape is changing fast — and it is evolving in ways that were unimaginable just a few years ago. That means IT organizations are changing. The skill sets needed are different. The delivery methods are different. The way we integrate into our existing environments is different. We need to ask, “What hypothesis do I want to explore?” instead of “What are the requirements?” And all change is hard.

We decided the most useful model for Freddie Mac is one that combines business and IT resources in one team, so we created our Big Data Center of Excellence (CoE). This CoE brings together dedicated resources to support the development of use cases, deliver data not currently available on the platform and measure value. The data scientists remain in the business areas and can concentrate on asking new questions and executing analytics and visualizations.

The rapidly evolving world of big data is exciting, and both IT and the business are in it together. We will continue to partner closely with our business leaders to identify the most impactful structure to help us evolve toward a data lake.

This week’s guest blog is by Joseph Norton, Consultant, Information Management at LMI. Norton shares strategies for improving your organization’s enterprise architecture to better fulfill your mission.

Enterprise architecture improves how organizations develop their strategic plans, make investment decisions and establish effective enterprise governance. Federal agencies can use enterprise architecture to make their operations more efficient as well as promote strategic and innovative initiatives.

Laws such as the Federal Information Technology Acquisition Reform Act (FITARA) require federal agencies to provide more transparent reporting on IT portfolios. These laws are so new that many best practices for IT portfolio reporting are still being developed.

LMI works with the General Services Administration (GSA) to better connect its budget and IT portfolio management processes to help with its reporting. This approach helps GSA answer the following questions:

  • How do we decide when to enhance, migrate, or retire our applications?
  • What functions do those applications support and what are their life cycle costs?
  • What technology standards are approved for use?
  • How should we introduce new or emerging technology?
  • Are we engaging with all of our stakeholders on the right projects and at the right time?

In many cases, the ability to rapidly and consistently pull standardized reports saves time and improves data quality for employees.
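As a hypothetical illustration of the kind of standardized report such a process enables (the field names, figures and dispositions below are invented, not GSA's actual data model), a portfolio rollup might connect each application's supported function and life-cycle cost to its enhance/migrate/retire decision:

```python
from dataclasses import dataclass

@dataclass
class Application:
    name: str
    business_function: str
    annual_cost: float  # life-cycle cost, in $K per year
    disposition: str    # "enhance", "migrate", or "retire"

# Illustrative portfolio records; a real inventory would come from the
# agency's central data repository.
portfolio = [
    Application("GrantsPortal", "Grants Management", 420.0, "enhance"),
    Application("LegacyHR", "Human Resources", 310.0, "retire"),
    Application("FleetTrack", "Logistics", 150.0, "migrate"),
]

def cost_by_disposition(apps):
    """Roll up annual life-cycle cost per enhance/migrate/retire decision."""
    totals = {}
    for app in apps:
        totals[app.disposition] = totals.get(app.disposition, 0.0) + app.annual_cost
    return totals
```

A report like this answers the first two questions in the list above in one pass, and because it is generated from a single data source, every stakeholder pulls the same numbers.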

This conversation often starts with compliance, but very quickly, it becomes an even more interesting discussion around how to better fulfill your mission because data is more transparent and easily accessible.

Building a Healthy Enterprise Architecture

  • Make a business case for investment: Identify specific problems and expected outcomes that enterprise architecture will address. Track how much time employees spend on pulling reports. Evaluate the risks or costs associated with not providing reports in a timely fashion.
  • Evaluate business needs: Asking the right questions will help prioritize what data needs to be visualized and at what level of detail.
  • Assess your current enterprise architecture: How is the enterprise architecture program currently defined within your organization? Where does it sit in your organizational chart? How many resources does it have or need? How is it integrated with your current governance processes? Is it engaged only for compliance, or does it have a seat at the table when budget decisions are made? What types of interactions does it have with other departments? Enterprise architecture maturity models exist for assessment and goal setting in this area.
  • Assess your data management strategy: How many data repositories are there? Is there an open data policy? How is enterprise data currently managed—is it standardized or stove-piped? Are data repositories well-maintained and well-governed?
  • Create open communication between stakeholders: Advancing an organization's use of data for decision making involves considerable user outreach. It is important to let people, especially leadership, know this data exists. Ensuring dashboards continually improve requires ongoing interaction between data visualization specialists and developers. Requirements change all the time, whether due to legal changes or lessons learned.

A Healthy Enterprise Architecture Can Spur Innovation

When an organization improves its data quality, there are more opportunities to use data in innovative ways. Clean repositories with easy hooks or an application programming interface (API) can allow developers to support new applications that never would have been possible if the organization did not build a foundation of well-managed data.

LMI works with federal agencies and private industry on enterprise architecture best practices every day. We know which tools provide the reports needed for compliance and have an eye for open-source tools that do not require additional acquisitions. Please contact jnorton@lmi.org to discuss further.

***

Joseph Norton is a member of LMI's Information Management group. He helps federal agencies develop and communicate their IT strategy, develop enterprise architectures, and modernize complex information management systems. He has a B.S. in chemistry and computer science from the University of Miami and a Ph.D. in chemistry from the University of California, Los Angeles.

The 2017 Capital Data Summit is just over two weeks away on February 15, 2017 and will be held at The Ritz-Carlton, Tysons Corner. Just today, we launched an exciting new afternoon panel session: Help Wanted: Defining and Finding Data Analytics Talent.

The demand for workers skilled in data analytics continues to grow, and big data companies are struggling to find qualified employees with the right skillsets for their open positions. This panel will explore the qualifications required in a big data workforce, discuss how and where companies currently find their employees and share best practices for sourcing talent through academic partners.

The all-star panelists and moderator headlining the session include:

Panelists:

  • Bob Aldrich, Chief Financial Officer, Zoomdata
  • Dr. Ali Eskandarian, Dean, Virginia Science & Technology Campus and College of Professional Studies, The George Washington University
  • Steve Partridge, Vice President of Workforce Development, Northern Virginia Community College
  • Dr. John Zolper, Vice President, Research and Innovation, Raytheon

Moderator:

  • Stephen Barkanic, Senior Vice President and Chief Program Officer, Business-Higher Education Forum

We’re adding new content and speakers every week! View the latest Capital Data Summit agenda here.

Keep up with all of the Capital Data Summit action on Twitter with our hashtag: #CapitalData and by following @NVTC!

 
