In his guest blog, Earth Networks Chief Marketing Officer Anuj Agrawal shares an inside look into the power of environmental data and how it is making cities smarter, cleaner and more resilient. Agrawal will be speaking on the Smart Cities Panel at the 2017 Capital Data Summit taking place on February 15, 2017 at The Ritz-Carlton, Tysons Corner.


You interact with environmental data on a daily basis. It’s there when you turn on your smartphone to silence your alarm. It’s on your TV while you sip your morning coffee. And it’s on the radio during your commute to work. Free weather data and other types of environmental information help us pick out our outfits, time our commutes and plan our after-work activities. But can that data do more?

We here at Earth Networks think so. And so does the Smart Cities Council Readiness Program. All ten of the Smart Cities Council Readiness Challenge Grant finalists named energy or some form of the environment as their top priority. This is because weather and greenhouse gas (GHG) data play vital roles in making cities more livable, workable, sustainable and resilient.

Commercial-grade weather data provides more than just a weekly forecast. In fact, its diverse capabilities make it a key component of resilient cities. Weather affects a city’s population and many aspects of its economy, offering insights that no other data set can provide. Advanced weather data feeds and historical data are easily integrated into predictive models that give cities smart decision support, so they can plan for both routine and severe weather events.
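To make the idea of weather-driven decision support concrete, here is a minimal sketch of folding a weather feed into a simple planning rule. The field names, thresholds and sample data are hypothetical illustrations, not Earth Networks’ actual feed format or methodology:

```python
# Flag hours in a weather feed that exceed planning thresholds.
# Thresholds and schema are invented for illustration only.

def severe_weather_alerts(observations, wind_threshold_mph=40, rain_threshold_in=1.5):
    """Return (hour, reason) pairs for observations that breach thresholds."""
    alerts = []
    for obs in observations:
        reasons = []
        if obs["wind_mph"] >= wind_threshold_mph:
            reasons.append("high wind")
        if obs["rain_in"] >= rain_threshold_in:
            reasons.append("heavy rain")
        if reasons:
            alerts.append((obs["hour"], ", ".join(reasons)))
    return alerts

feed = [
    {"hour": "06:00", "wind_mph": 12, "rain_in": 0.0},
    {"hour": "07:00", "wind_mph": 44, "rain_in": 0.2},
    {"hour": "08:00", "wind_mph": 38, "rain_in": 2.1},
]
print(severe_weather_alerts(feed))  # → [('07:00', 'high wind'), ('08:00', 'heavy rain')]
```

A real city system would replace the hard-coded thresholds with model-driven forecasts and historical baselines, but the shape of the integration is the same: feed in, rule or model applied, decision out.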

As people move out of the suburbs and into cities, urban pollution is on the rise. The scientific community estimates that 70% of GHGs are generated in urban areas. GHG data is so important because if you can quantify the emissions in your city, you can take steps toward controlling them. With GHG data, smart cities can develop smart policies to rein in those emissions and have local, accurate baseline data to support their initiatives.

Environmental data is critical for smart city development. Both weather data and GHG data offer insights that can make communities safer, healthier and, ultimately, smarter. To learn more about how weather and GHG data can both help mitigate financial, operational and human risk in smart cities developing across the world, don’t miss the Smart Cities Panel at the Capital Data Summit on February 15, 2017.

Big Data: An Essential Public Asset

February 6th, 2017 | Posted by Alexa Magdalenski in Capital Data Summit | Guest Blogs

In his Capital Data Summit guest blog post, District of Columbia Chief Data Officer Barney Krucoff discusses the District’s new data investment and data accessibility plans and the importance of leveraging big data in municipalities today. Krucoff will be a speaker on the Role of the CDO panel at the 2017 Capital Data Summit on February 15, 2017 at The Ritz-Carlton, Tysons Corner.


The District of Columbia government has consistently blazed the trail to public data access. Whether real-time traffic patterns or invaluable health statistics, the data collected and managed by the District is an essential public asset. Data is just as important as our schools, roads and buildings; we would not have a functioning city without it.

Now under the leadership of Chief Technology Officer Archana Vemulapalli, the Office of the Chief Technology Officer (OCTO) understands more than ever the need to leverage the District’s data investment and the significance of data accessibility. OCTO launched drafts.dc.gov, DC’s version of the Madison Project, to gather comments and feedback from the public, tech and open government activists, civic groups, and government agencies.

Now, our comprehensive Open Data Policy represents a consensus of viewpoints, balancing safety, privacy and security concerns while mandating openness and transparency. The opportunity to share more data realizes Mayor Muriel Bowser’s commitment to use technology to innovate, increase transparency, and improve accountability across the government.

The final version of the Open Data Policy will modernize and augment the District’s central data catalog. In turn, the public, media, entrepreneurs and academics will gain greater access to a variety of data sets. Stakeholder and resident input will be the barometer by which we measure the policy’s success. These insights will allow OCTO to learn from successes and shortcomings while planning for a secure, yet transparent, future. Our collective knowledge and expertise make us more effective and better positioned to become one of the most open jurisdictions in the country.

Learn more about D.C. OCTO here.

Embracing Big Data

February 2nd, 2017 | Posted by Alexa Magdalenski in Capital Data Summit | Guest Blogs | Member Blog Posts

The inaugural 2017 Capital Data Summit is less than two weeks away! Susan Burke, vice president, single family data delivery services at Freddie Mac, will be participating on the Role of the CDO panel at the Summit. In her guest blog post, Burke shares thoughts and questions leaders should consider when aligning big data activities with their organization’s business goals. 


Like many enterprises, we at Freddie Mac are in the process of determining what big data means to us. This past year, we started our journey to expand from the traditional rationalized data stores to the world of unstructured data and new technologies. One step on that path was considering a Hadoop environment. Would it bring value? What business problems would it solve?

The first lesson we learned is one that tends to repeat itself in the IT-business world. In our case, we raced to deploy a new technology. Why wouldn’t we, when there were “obvious” value propositions we knew we could deliver? It quickly became apparent that, while our initial proof of concept provided insight into what needed to change from a back-end engineering perspective, we had not aligned with the business. Without a strong business champion, a technology-for-technology’s-sake effort is doomed.

Fortunately, a strong, respected business leader stepped up to garner support and help IT define the business-use cases that would deliver value to the organization. Off we went.

Now that we’ve implemented Hadoop, what does it mean? How do we support it? Where do the data scientists fit into this picture? The technology in the data landscape is changing fast — and it is evolving in ways that were unimaginable just a few years ago. That means IT organizations are changing. The skill sets needed are different. The delivery methods are different. The way we integrate into our existing environments is different. We need to ask, “What hypothesis do I want to explore?” instead of “What are the requirements?” And all change is hard.

We decided the most useful model for Freddie Mac is one that combines business and IT resources in one team, so we created our Big Data Center of Excellence (CoE). This CoE brings together dedicated resources to support the development of use cases, deliver data not currently available on the platform and measure value. The data scientists remain in the business areas and can concentrate on asking new questions and executing analytics and visualizations.

The rapidly evolving world of big data is exciting, and IT and the business are in it together. We will continue to partner closely with our business leaders to identify the structure that best helps us evolve toward a data lake.

This week’s guest blog is by Joseph Norton, Consultant, Information Management at LMI. Norton shares strategies for improving your organization’s enterprise architecture to better fulfill your mission.

Enterprise architecture improves how organizations develop their strategic plans, make investment decisions and establish effective enterprise governance. Federal agencies can use enterprise architecture to make their operations more efficient as well as promote strategic and innovative initiatives.

Laws such as the Federal Information Technology Acquisition Reform Act (FITARA) require federal agencies to provide more transparent reporting on their IT portfolios. These laws are so new that many best practices for IT portfolio reporting are still being developed.

LMI works with the General Services Administration (GSA) to better connect its budget and IT portfolio management processes to help with its reporting. This approach helps GSA answer the following questions:

  • How do we decide when to enhance, migrate, or retire our applications?
  • What functions do those applications support and what are their life cycle costs?
  • What technology standards are approved for use?
  • How should we introduce new or emerging technology?
  • Are we engaging with all of our stakeholders on the right projects and at the right time?

In many cases, the ability to rapidly and consistently pull standardized reports saves time and improves data quality for employees.

This conversation often starts with compliance, but very quickly, it becomes an even more interesting discussion around how to better fulfill your mission because data is more transparent and easily accessible.

Building a Healthy Enterprise Architecture

  • Make a business case for investment: Identify specific problems and expected outcomes that enterprise architecture will address. Track how much time employees spend on pulling reports. Evaluate the risks or costs associated with not providing reports in a timely fashion.
  • Evaluate business needs: Asking the right questions will help prioritize what data needs to be visualized and at what level of detail.
  • Assess your current enterprise architecture: How is the enterprise architecture program currently defined within your organization? Where is it in your organizational chart? How many resources does your enterprise architecture program have or need? How is it integrated with your current governance processes? Are they only in compliance or do they have a seat at the table when budget decisions are made? What types of interactions do they have with other departments? There are enterprise architecture maturity models for assessment and goal setting in this area.
  • Assess your data management strategy: How many data repositories are there? Is there an open data policy? How is enterprise data currently managed—is it standardized or stove-piped? Are data repositories well-maintained and well-governed?
  • Create open communication between stakeholders: Making advances in using data for decision making for an organization involves considerable user outreach. It is important to let people, especially leadership, know this data exists. Ensuring dashboards continually improve requires an ongoing interaction between data visualization people and developers. Things are changing all the time either due to legal changes or lessons learned.

A Healthy Enterprise Architecture Can Spur Innovation

When an organization improves its data quality, there are more opportunities to use data in innovative ways. Clean repositories with easy hooks or an application programming interface (API) can allow developers to support new applications that never would have been possible if the organization did not build a foundation of well-managed data.
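As a small illustration of the “easy hooks” idea, here is a sketch of a thin API layer over a well-governed repository. The dataset names and fields are hypothetical, not an actual agency schema:

```python
# Sketch: a clean, governed repository exposed through a simple
# API-style accessor that returns JSON. Schema is invented.
import json

REPOSITORY = [
    {"dataset": "traffic_counts", "updated": "2017-01-30", "records": 12840},
    {"dataset": "health_stats", "updated": "2017-01-28", "records": 5312},
]

def get_dataset(name):
    """Return one dataset's metadata as JSON, as an API endpoint might."""
    for entry in REPOSITORY:
        if entry["dataset"] == name:
            return json.dumps(entry)
    return json.dumps({"error": "not found"})

print(get_dataset("traffic_counts"))
```

Because the repository is standardized and well maintained, the accessor stays trivial; that is the payoff of the data-management groundwork described above.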

LMI works with federal agencies and private industry on enterprise architecture best practices every day. We know which tools provide the reports needed for compliance and have an eye for open-source tools that do not require additional acquisitions. Please contact jnorton@lmi.org to discuss further.

***

Joseph Norton is a member of LMI’s Information Management group. He helps federal agencies develop and communicate their IT strategy, develop enterprise architectures, and modernize complex information management systems. He has a B.S. in chemistry and computer science from the University of Miami and a Ph.D. in chemistry from the University of California, Los Angeles.

New year, new tech trends…especially in HR! Insperity Technology Solutions Analyst Lindsey Moreno discusses 2017’s HR trends and the latest HR software to help your organization stay ACA and FLSA compliant.


If you’ve still got someone entering timesheet data by hand, 2017 is the year to automate your HR processes.

Once upon a time, businesses with 50-150 employees could manage benefits and payroll through a series of non-automated, disjointed programs. It was simpler to stay in compliance with state and federal regulations related to employment back then.

Well, those days are gone. The Affordable Care Act (ACA) and the Fair Labor Standards Act (FLSA) set out a complex series of rules for how businesses measure eligibility for benefits and what constitutes a full-time employee.

The good news is that 2017 promises to offer several HR software products to help small and middle market businesses stay compliant. Furthermore, if you get the right systems in place, it can be a “set it and forget it” situation.

Managing the complex

You heard a lot of talk about the FLSA in 2016. The new overtime regulation would have changed overtime eligibility for white collar, salaried workers earning less than $913 a week ($47,476 a year).

Even though a court temporarily halted implementation, time and attendance systems are always needed because you must track working hours and maintain accurate data on your employees. The latest software improves your ability to address this need, resulting in easier federal and state compliance.

Benefits administration is more complex, too

In 2016, many businesses also struggled with the role the ACA played in determining who is considered a full-time employee and the benefits to which they’re entitled. Now, the minimum value of those benefits and their cost to employees must be evaluated and tracked.

It’s no longer easy to designate who is “full time” and “part time.” Depending on a variety of factors, part-time employees may work enough hours to become “full-time equivalents.”

It’s too complicated to “wing it” as you keep track of exempt and non-exempt, salaried and hourly employees, and who is eligible for overtime pay and various levels of benefits. Without automation, you could easily have one full-time person doing nothing but tracking and measuring who qualifies when for which benefits based on their full-time or part-time status.

Automation simplifies this process and allows managers to run reports to support planning and manage costs. For example, a report could provide information on whether it’s more affordable for your business to pay for higher benefits required by law, or pay the penalties for offering a lower level of benefits.

In addition to calculating who is eligible, systems can automatically identify “minimum essential coverage” and “minimum essential value” as well as provide insight into the affordability requirements laid out by the ACA.
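The core of that eligibility calculation can be sketched in a few lines. This toy applies the ACA’s commonly cited 130-hours-per-month full-time test; real determinations involve measurement and stability periods and should be handled by qualified software or counsel, so treat this as illustration only:

```python
# Toy sketch of eligibility tracking: classify each employee for one
# month against the ACA's 130-hour full-time threshold. Names and
# hours are invented; real rules use measurement periods.

FULL_TIME_MONTHLY_HOURS = 130

def classify_employees(monthly_hours):
    """Map employee name -> 'full-time' or 'part-time' for one month."""
    return {
        name: "full-time" if hours >= FULL_TIME_MONTHLY_HOURS else "part-time"
        for name, hours in monthly_hours.items()
    }

hours = {"Avery": 160, "Blake": 96, "Casey": 130}
print(classify_employees(hours))
```

An automated system runs this kind of check continuously across pay periods, which is exactly the bookkeeping that is too error-prone to “wing” by hand.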

Finally, integrated HR software should also help you gather the data you need to file government paperwork at the end of the year.

Integration, convenience and mobility are key

When choosing your new HR software, integration is key. You’ll want a time and attendance system that feeds data to your payroll and benefits systems because the new regulations mean hours worked over a certain period impact pay and benefits.

When you’re considering software that will be regularly used by your employees, it can be extremely beneficial to choose a program that offers a smartphone app. Since most people carry their phones everywhere, such apps make simple transactions easy, such as checking paid time off or official company holidays, looking up a recent pay stub, etc.

For the people who are responsible for the management and administration of your HR, find software backed by easily accessible technical support.

Another service to look for is software backed by a support website or call center.

Particularly if your company is big enough to be affected by the ACA and FLSA, but not yet big enough to need more HR staff, a support website can help answer questions your staff may have about COBRA, sick leave or payroll compliance.

Insperity is one of many companies that offer software to integrate and automate payroll, time and attendance, and benefits, as well as mobile apps backed by a call center for additional support.

Self-service still pays

Electronic benefits systems are a must to remain compliant, but these tools can also help streamline your HR department.

In the past few years, more and more companies have been moving to self-service benefits administration. This trend still holds since many smaller companies have yet to adopt HR software that allows employees to electronically enroll, make changes to their benefits or find out how much time off they have banked.

The benefits of employee self-service are many:

  • It relieves the burden of HR employees keying in data.
  • It improves accuracy since the people who know their data best, the employees, enter all details and can make changes as needed.
  • Employees can check their pay stubs, tax information and PTO balances without involving their manager or HR representative.
  • It improves the filing, monitoring and payment of expenses.
  • It speeds up onboarding since new employees can complete employment documents online, before their first day.

Ideally, your new software package should keep you compliant, help you manage your workforce and give you the data you need to strategically plan for the future.

Want more tips for staying compliant with employment regulations? Download Insperity’s free e-book, 7 Most Frequent HR Mistakes and How to Avoid Them.

This week’s NVTC member guest blog post is by Kelly Yates, vice president of Service Operations at Insperity. Yates shares five key priorities outlined in the Equal Employment Opportunity Commission’s (EEOC) recently-released Strategic Enforcement Plan.

The Equal Employment Opportunity Commission (EEOC) recently released its Strategic Enforcement Plan through the year 2021, which outlines its priorities for the coming years.

In this plan, one area of focus is a grouping the agency refers to as “emerging and developing workplace issues” – an ever-evolving subject that can be challenging to navigate. From qualification standards and inflexible leave to discriminatory practices, these complaints typically parallel societal issues that are gaining a bigger spotlight in the media.

Here are five areas employers should watch carefully as they start their new year:

1. Discrimination against those with disabilities

The EEOC continues to broaden its definition of what constitutes a disability. Under the EEOC’s new plan, the agency is narrowing its focus to give priority to the areas of job qualification standards and inflexible leave policies to better protect disabled employees.

Qualification standards

A qualification standard is how the employer defines who is qualified to be hired for a particular job. This means that job descriptions could fall under scrutiny if there are unnecessary physical qualifications placed on applicants.

For example, if your job description says that applicants must be able to lift 20 pounds, but there is an accommodation that could be provided where lifting wouldn’t be required, it may not be reasonable to use this as a qualifier when making hiring decisions.

Inflexible leave policies

There is no clear-cut definition of whether a leave policy is flexible or inflexible. But typically, any policy that takes a definitive stance on the time limits of a leave will be labeled inflexible by the EEOC.

For example, an inflexible policy might state: “You must be 100 percent healed to return to work when a 12-week FMLA leave ends, or you will be terminated.”

Leave policies require careful attention to language to ensure they cannot be interpreted as inflexible or one-size-fits-all. These situations also require an interactive dialogue between employers and employees to understand what return-to-work accommodations may need to be made, such as granting additional time off or the ability to telecommute for a specified period.

Even a policy that says that employees aren’t allowed to work remotely could be deemed as inflexible, if, based on their disabilities and the nature of their job duties, it would be a reasonable accommodation to allow them to work from home.

This is one of the biggest risk areas for employers. It’s one of the most challenging to work through because there is not a definitive manual on the topic. Each situation has to be handled individually and based upon its own merits – not based upon how the last case was treated.

2. Accommodating pregnancy-related limitations

Gender equality in the workplace has been in the spotlight in recent years. One area that the EEOC has weighed in on heavily has been removing pregnancy as a barrier for equal treatment in the workplace.

Employers are advised to treat pregnant employees as they would treat any other employee who has any other temporary medical condition. Consider all accommodation requests by engaging in an interactive dialogue with the employee and make necessary job modifications as specified by the employee’s medical provider.

For example, accommodations may include altered breaks and work schedules (e.g., breaks to rest or use the restroom), permission to sit or stand, ergonomic office furniture, shift changes, elimination of marginal job functions and permission to work from home for a specified time period.

If a woman isn’t able to work during her pregnancy, she may be eligible for time off not only under FMLA, but also under the Americans with Disabilities Act (ADA). Pregnancy complications may be covered under the ADA as a temporary disability.

These types of accommodations help to keep barriers neutralized so that women can continue to progress in their careers while pregnant. And, these accommodations can also help ward off potential discrimination claims.

3. Protecting LGBT employees from discrimination based on sex

If you look up the Civil Rights Act of 1964, you won’t see LGBT as a category of protection. But the agency’s recent interpretation has been that it falls into the broad category of discrimination based on sex.

To avoid EEOC discrimination charges, you must treat LGBT employees as you would all other employees. If employees come to you with complaints, make sure to treat the complaint seriously, and ensure that a prompt and thorough investigation is undertaken.

Share the outcome of the investigation with the employee, and discuss resolution options. Then, be sure to follow up with the employee regularly after the issue is resolved to ensure that no new concerns have arisen. This also conveys to your employee that you have an open door to ongoing communication.

Work toward making sure that your workplace is a comfortable environment for all employees.

4. Clarifying the employment relationship and workplace rights

Temporary workers, staffing agencies, independent contractors and the on-demand economy are all changing the dynamic of the employer/employee relationship.

For example, let’s say you hire five temporary workers from a staffing agency. The agency pays these employees, so they aren’t your “employees of record” for purposes of your payroll taxes, etc. However, because these workers are conducting work on your premises and for your business, including engaging with your employees, you may be deemed to be controlling the terms and conditions of the employees’ work environments. Therefore, you must ensure that these workers are treated in compliance with EEO laws, just as you would for all employees.

Say, for instance, one of your employees or managers harasses a temporary agency worker. The temporary worker could file an EEOC claim against your business, even though you’re not his or her employer of record. And, if the agency deems that the worker’s complaint has merit, you could be liable to make financial settlements to resolve the complaint.

Just because workers aren’t on your payroll as full-time, permanent employees, it doesn’t mean you are absolved of responsibility for ensuring that they are treated fairly under the law.

5. Addressing discriminatory practices against Muslims

In the midst of global terrorist attacks, the EEOC believes there may be an increase in discriminatory actions against people who are Muslim, Sikh, or of Arab, Middle Eastern or South Asian descent (or those perceived to be a part of those groups).

In an effort to get out in front of this, the EEOC is giving priority consideration to these cases.

First and foremost, hiring managers must understand that national origin or religion should never be used as a factor in any hiring or employment decision.

You may think that everything is fine because you aren’t hearing any complaints or concerns from employees. But often, employees are hesitant to file complaints because they think doing so will jeopardize their jobs. In these cases, they may wait to bring forward a concern until it has escalated to a point where they have already decided to involve an outside agency or attorney.

Your managers should take extra steps to vocalize and demonstrate their commitment to the company’s open door policy so that employees aren’t afraid to come to them with their concerns.

Also, ensuring that you regularly conduct discrimination and harassment prevention training for all employees and that you have clear policies with a zero-tolerance for discrimination and harassment in-place are important preventative steps for employers to take.

Understand the risks and move forward thoughtfully

While the EEOC provides information for employers on their website, employers are often confused as to how each law and regulation should be interpreted and applied to their specific workplace. As you navigate through issues in these areas, it is advisable to work closely with HR experts, legal counsel or a professional employer organization (PEO), who are well-versed and experienced at preventing and resolving workplace complaints.

Keep in mind that if an EEOC charge is filed and the agency begins an investigation of your company, it could quickly become a huge disruption to your employees and to your business overall.

Here are a few possible outcomes:

  • On average, it takes at least a year for a charge to be resolved.
  • There can be disruptions if EEOC investigators come on-site to interview employees and review your company’s documents.
  • The EEOC doesn’t have to notify your business when it contacts your employees or former employees for interviews.
  • The investigation can impact the morale and productivity of your workforce.
  • There are huge monetary impacts for having to defend against and resolve the complaint.
  • A “cause finding” by the EEOC can become a matter of public record, which impacts your reputation and credibility as an employer.
  • If the EEOC finds cause for a complaint, typically, your business will have to routinely provide reports to the agency and comply with ongoing requirements, such as employee training – usually for three to five years.
  • If the EEOC finds cause for a complaint, you may be required to post notices throughout the workplace to inform employees that your company was found to have engaged in a discriminatory practice.

If your business is charged in one of the areas from the strategic enforcement plan, it’s important to take it seriously and respond appropriately. Consider consulting with an HR expert or employment counsel on how to proceed.

Learn more about how to protect your business by downloading Insperity’s complimentary e-book, Employment Law: Are You Putting Your Business at Risk?

This week’s guest blog post is by Norm Snyder, partner at Aronson, LLC. Snyder is also chair of NVTC’s Small Business and Entrepreneur Committee. Snyder shares highlights and lessons learned from the Committee’s All Star Seed/Early Stage Investor Panel that took place on Nov. 15.

Can seed, early stage and angel investment capital be found in the D.C. metro area? This question and others were discussed by NVTC’s Small Business and Entrepreneur Committee’s engaging All Star Seed/Early Stage Investor Panel on Nov. 15, with some of the area’s most active early stage investors.

Moderated by Aronson Partner Norm Snyder, the panel included Ed Barrientos, “super-angel” investor, entrepreneur and CEO of Brazen; Steve Graubart, CFO of 1776; John May, founding partner of New Dominion Angels; Liz Sara, angel investor, entrepreneur and chair of the Dingman Center; and Tom Weithman, managing director of CIT GAP Funds and CIO of Mach37.

During the event, panelists discussed their recent experiences, desired investee profiles and offered practical advice to an audience of start-up entrepreneurs engaged in navigating the challenging early stage investment world. While the general consensus is that early stage capital is available in the D.C. metro area, it takes persistence and hard work for entrepreneurs to successfully attract sufficient investment from the right investors.

According to Weithman, CIT has funded over 100 companies in Virginia, with a focus on tech, fintech, cyber and life sciences. However, he noted a dearth of seed funds generally available for cyber. Sara said her Dingman angel group funded approximately 15 deals in the last year. Barrientos has made significant angel investments in a number of companies and has raised venture capital funds for Brazen. May stated that almost every deal that should be funded is funded, but it is rare for one angel to fund an entire deal. Graubart said 1776 has made 30 investments to date, with a focus on regulated industries such as ed-tech, health IT, fintech, smart cities and transportation.

So how does a company stand out in the crowd of early stage firms competing for investment?

Panelists offered a range of suggestions. Research potential investors – plenty of information is available to find out what they are interested in. Don’t waste your time and theirs chasing investors not interested in your company’s profile. At the early stage, investors are betting first on the entrepreneur and their team, not on a single idea or concept, which is likely to evolve several times before it goes to market. Put together a passionate team with strong domain experience and the ability to sell themselves to attract investors, customers and future team members. Remember, the team should include an experienced advisory board with strengths and experiences that complement and extend the abilities of the entrepreneurs. Put together well-thought-out and concise pitches and applications.

Be persistent – get in front of groups of investors. Warm referrals tend to get looked at first, so use your advisors to help you get noticed and invest time building relationships. Be able to demonstrate market acceptance and traction. Be coachable; you may be the “master” of your technology, but each successful start-up faces different challenges and there’s a lot to learn. Early stage entrepreneurs shouldn’t focus on trying to get the “highest” valuation – high valuations can scare away very qualified investors and may lead to future disastrous down rounds. Convertible debt, instead of preferred stock, can help take the focus off the subjective valuation issue for early stage companies.

Most importantly, the closing advice to attendees: be passionate and persistent and make sure you enjoy what you do!

Our latest NVTC member guest blog post is by ePlus Chief Security Strategist Tom Bowers. Bowers discusses the latest advancements in machine learning and its impact on cybersecurity.

According to a Ponemon Institute study released in March, 63% of survey respondents said their companies had been hit by “advanced (cyber) attacks” within the last year. Only 39% felt their company was highly effective at detecting cyber attacks. And worse, only 30% considered their organizations highly effective at preventing them.

A few weeks ago, I moderated a panel discussion at the ePlus/EC-Council Foundation CISO Security Symposium in National Harbor, Md. Our purpose was to gather together leading security experts to get their insights on the latest security threats and to discuss ideas and strategies. CISOs from many different industries were there. And as you might imagine, given the importance of cybersecurity today, the event was well-attended.

During the session, we covered various pressing topics in the realm of cybersecurity. But the most intriguing “future-looking” trend we discussed was machine learning.

That’s not a surprise, because machine learning is a hot topic in tech circles. But it’s more than just the latest industry buzzword, and vendors are responding accordingly. In March, Hewlett Packard Enterprise (HPE) announced the availability of HPE Haven OnDemand, its cloud-platform “machine-learning-as-a-service” offering. In October, IBM, whose Watson system is known as a leader in artificial intelligence (AI), renamed its predictive analytics service “IBM Watson Machine Learning” to emphasize its direction “to provide deeper and more sophisticated self-learning capabilities as well as enhanced model management and deployment functionality within the service.”

Simply speaking, machine learning refers to the ability of computers to, in effect, “learn and grow in knowledge” based on past experience. Machine learning begins with a base set of teaching material, and through subsequent experience (i.e., processing more and more data sets and responses), the algorithm adds to that base material—its body of knowledge, so to speak—and the program becomes more intelligent. As a result, machine learning programs are able to answer questions and make predictions with increasing accuracy.
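To make that learn-from-experience loop tangible, here is a toy Python sketch (not any vendor’s product, and the class name and data are invented): a classifier that folds each new labeled example into a running average and answers by proximity to what it has seen so far.

```python
# A minimal sketch of "learning from experience": every labeled example
# updates the model, and predictions pick the closest learned average.

class RunningMeanClassifier:
    def __init__(self):
        self.sums = {}    # label -> running sum of feature values
        self.counts = {}  # label -> number of examples seen

    def learn(self, features, label):
        """Fold one new experience into the body of knowledge."""
        sums = self.sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            sums[i] += value
        self.counts[label] = self.counts.get(label, 0) + 1

    def predict(self, features):
        """Answer with the label whose average example is closest."""
        def distance(label):
            n = self.counts[label]
            return sum((x - s / n) ** 2
                       for x, s in zip(features, self.sums[label]))
        return min(self.counts, key=distance)

clf = RunningMeanClassifier()
clf.learn([1.0, 1.0], "benign")      # invented training data
clf.learn([9.0, 9.0], "malicious")
print(clf.predict([8.0, 8.5]))       # → malicious
```

Real machine learning engines use far richer models, but the shape is the same: more processed experience, better answers.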

What are the implications for security operations?

Machine learning has made tremendous strides in the last few years. From self-driving vehicles to medical research to marketing personalization to data security, machine learning algorithms are being used to churn through huge stores of data to identify patterns and anomalies, enabling data-driven decisions and automation. And that capability continues to mature and extend into the area of cybersecurity.

For years, those of us in IT security have worked tirelessly to increase the maturity of security operations in our companies. We’ve strived—in the face of increasing complexity and rising threats—to advance our information security capabilities beyond simple “detect and respond” reactive methods to risk-based “anticipate and prevent” proactive approaches. Machine learning is playing a role in that mission today and will play an even larger part in the years to come.

As more security vendors incorporate machine learning engines into their solutions, security operations will change. For example, log scanning—a tedious, labor-intensive effort—will become automated. Instead of a security analyst scrolling SIEM output, scrutinizing correlated events and analyzing their meaning, machine learning engines will parse huge log files, identify anomalies, and make decisions in near real-time.
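A toy version of that kind of automated log analysis, with invented numbers: flag the latest hourly count of failed logins if it sits more than three standard deviations above the baseline of earlier hours.

```python
from statistics import mean, stdev

# Hypothetical hourly counts of failed logins pulled from a SIEM export.
# The numbers and the three-sigma threshold are illustrative only.
hourly_failed_logins = [12, 9, 14, 11, 10, 13, 8, 12, 11, 97]

baseline = hourly_failed_logins[:-1]   # earlier hours form the baseline
mu = mean(baseline)
sigma = stdev(baseline)

latest = hourly_failed_logins[-1]
z = (latest - mu) / sigma              # how unusual is the newest hour?
if z > 3:                              # classic three-sigma rule
    print(f"anomaly: {latest} failed logins this hour (z = {z:.1f})")
```

A production machine learning engine would learn these baselines across thousands of signals simultaneously, but the decision it automates is the same one an analyst makes scrolling SIEM output.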

In addition, machine learning engines will identify trends, threats, and incidents much faster. Instead of waiting on a security analyst to conclude their analysis, machine learning engines will parse reams of security data collected from enterprise machines, such as servers, smartphones, tablets, network devices, applications, and others. Through big data analytics and machine learning, this machine data will be searched and analyzed to gain insight into what is happening inside corporate networks, enabling trends to be exposed and incidents to be identified much faster than they are today.

But more importantly, machine learning engines will be able to “hunt” for exploits. By combining input from learned behaviors, known indicators of compromise (IOCs), and external threat intelligence feeds, machine learning engines will be able to predict malicious events with a high degree of accuracy, preventing major incidents before they materialize or become widespread problems. And we are seeing examples of this capability today. For instance, the cyber solution Endgame operates at the microprocessor level, analyzing the pre-fetch instruction cache in search of zero-day exploits so they can be detected and eliminated long before an incident occurs.

Not to be overlooked is the ability of machine learning to enable automated responses. Machine learning engines not only can detect malicious behavior faster, based on IOCs and “experience,” but also can take action to eliminate the threat early in the kill chain without requiring human involvement. This enables incidents to be avoided proactively and lessens the workload on short-handed staff.
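A stripped-down illustration of that IOC-driven automated response, with made-up indicators and events:

```python
# Events whose source matches a known indicator of compromise are blocked
# without human involvement. Feed contents and events are hypothetical.

known_bad_ips = {"203.0.113.7", "198.51.100.23"}   # from a threat intel feed

events = [
    {"src": "192.0.2.10",  "action": "login"},
    {"src": "203.0.113.7", "action": "port-scan"},
]

blocked = []
for event in events:
    if event["src"] in known_bad_ips:
        blocked.append(event["src"])   # in practice: push a firewall rule

print(blocked)  # → ['203.0.113.7']
```

The real value comes when the blocklist is not static but continuously re-learned from behavior and external feeds, so the response fires early in the kill chain.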

The benefits of machine learning are clear and compelling. But many security professionals are asking, “Is the technology really ready?” There are valid concerns, such as the validity of the data that external threat intelligence feeds supply to machine learning engines and the potential for machine learning algorithms to be attacked and fed false models, but vendors and academia alike continue working to sort out those questions. In fact, Georgia Institute of Technology just launched a new research project to study the security of machine learning systems.

Like most technology, machine learning will continue to evolve. But if expectations prove out, machine learning will transform how CISOs manage security operations within the next three years.

Did you know?

  • Only 7% of federal employees today are age 30 or under – the lowest percentage in the last ten years
  • By 2017, 31% of federal workers will be eligible to retire
  • The government loses about 5,000 information technology employees each year

In a recent Government Executive blog post, NVTC member Susan Fallon Brown of Monster Government Solutions shared these astounding statistics and highlighted the growing opportunity for the federal government to bolster its millennial workforce and reduce overall hiring gaps with millennial talent. Here are some of the key themes she shared in the blog:

  • The importance of federal agencies being able to articulate their missions – millennials want to be a part of organizations that serve the greater good; an agency’s mission statement, often the first point of entry into an organization for a candidate, must clearly express the positive impact the agency is making
  • Digital channels are key to millennial recruitment – millennials are using social networks and digital channels in their job search more than ever before; agencies should leverage their digital channels as an extension of their recruitment efforts, utilizing clear and enticing messaging
  • Transparency and engagement are a must in the recruitment process – millennials want to be continually engaged in the hiring process. They want feedback from recruiters at all stages of the hiring process – and to hear from recruiters after the interview process, even if they didn’t get the job

Millennials make up about one-third of the workforce in Fairfax and Arlington Counties according to a 2016 Millennial Research report conducted by NVTC’s NextGen Leaders Committee. The report explored what attracts and retains millennials in organizations in Northern Virginia.

The notion of connection – millennials’ desire to feel connected to the community they live in, to their employer’s mission and charitable efforts, and to their colleagues – emerged throughout the report. Here are some interesting points from the research:

  • Millennials place strong emphasis on flexibility in their positions – in their schedule, in the physical location of their job and in their responsibilities. Rather than the number of hours they work, millennials want to be evaluated on the quality of their output.
  • Millennials place strong value on ongoing learning and development opportunities; career progression and mentorship are highly important, even though company loyalty isn’t always a driving career factor for millennials.
  • Millennials highly value employee recognition in a variety of forms, including constructive feedback, awards, perks and promotions.
  • A company’s social responsibility efforts and commitment to ethics are critical for millennials and a driving recruitment factor; millennials place strong value on their employer’s trustworthiness, transparency and commitment to bettering the world.

Interested in learning more about recruiting and retaining millennials in our region? Read the full NextGen Leaders Millennial Research report.

Check out Government Executive’s blog here.


This NVTC guest blog post is written by Marc Burkels, manager of dedicated servers at LeaseWeb. LeaseWeb, an NVTC member company, is an Infrastructure-as-a-Service (IaaS) provider offering dedicated servers, CDN and cloud hosting on a global network. LeaseWeb recently exhibited at the Capital Cybersecurity Summit on Nov. 2-3, 2016.

Let’s say you want to become the new Facebook. Believe it or not, I regularly run into people who have this ambition. The number one question these new Mark Zuckerbergs ask me is which server they need.

It is always a challenge to convince them not to rush into anything. Instead, I have them sit down and tell me what they really want. Since many companies switch servers within a few months of buying – which is always time consuming, not to mention costly – it is certainly worth your while to think carefully before you decide. What is the service you want to deliver? What is your workload? Does it involve large databases?

I always discuss the following 8 things to help people decide on the right hosting provider and hardware configuration of a dedicated server:

1. Business impact of downtime

What is the business impact of a potential failure of your hosting environment? One of the first things to consider when selecting a dedicated server is how to deal with potential downtime. In a cloud environment, the setup of the cloud protects you against hardware failures. With a dedicated server, you know you are not sharing resources with anyone else, but a single server is always a single point of failure, so you need to decide whether you can accept potential downtime if you do not have the option to scale to multiple dedicated servers.

2. Scalability of your application

Scalability is another important issue when choosing a dedicated server. How well does your application scale? Is it easy to add more servers, and will that increase the number of end users you can service?

If it is easy for you to scale, it doesn’t matter whether you use a dedicated server or a virtual solution. However, some applications are difficult to scale across multiple devices. Keeping a database running on multiple servers is a challenge, since it needs to be synchronized across all database servers. It might even be easier to move the database to a server with more processing capacity, RAM and storage. Moving to a cloud environment – where you can clone a server, run a copy in production and add a load balancer to redirect traffic across multiple servers – could also be a good option for you.

3. Performance requirements of your server

What are your performance requirements? How many users do you expect and how many servers do you potentially need? Several hardware choices influence server performance:

Processor/CPU

Generally, you can choose the number of processors and cores in a server. Whether you will benefit from more cores depends on the application you are running, although any multi-threaded application – a web server or database server, for instance – will benefit. Also consider the performance of each core, defined by its clock speed: some processors deliver better turnaround times with fewer cores and a higher clock speed per core. The advice on which processors and how many cores to choose will ideally come from whoever manages the application, or from the software vendor. Of course, they also need to take the expected number of users into account.

RAM

The faster the CPU and the more cores it has, the more RAM options are available to you. If you are unsure about your RAM needs, choose a server that allows you to add RAM later, since this is relatively easy. The range of RAM choices, especially with dual processors, is enormous.

The size of your server matters when choosing RAM, as does the generation of the technology. Current-generation servers use DDR4 technology, which can have a positive effect on database performance. DDR4 is also attractively priced nowadays, since it has become the standard.

Hard Drives

Choose a RAID set-up for your hard drives, so you are well protected against the failure of a single hard drive. Your system will still be up and running – with some performance loss – until the hard drive is replaced.

The larger the server, the more hard drive options you have. SATA drives offer high capacity but relatively low performance. SAS performs roughly twice as well as SATA, but at a higher price and lower capacity. SAS, in turn, is being succeeded by SSDs, which are 50 to 100 times faster than SATA.

4. Load balancing across multiple dedicated servers

If your application can scale across multiple dedicated servers, some form of load balancing – where end users are split across all available servers – is necessary. If you are running a website and traffic is rising, at some point you will need multiple web servers serving the same website to a multitude of users. With a load balancing solution, every incoming request is directed to a different server. Before doing so, the load balancer checks whether a server is up and running; if it is down, traffic is redirected to another server.
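The check-then-route behavior can be sketched in a few lines of Python. Server names and health states below are invented; real load balancers (hardware appliances, HAProxy, nginx and the like) probe health continuously rather than consulting a static table.

```python
from itertools import cycle

# A toy round-robin load balancer: requests rotate across the pool, and a
# health check skips any server that is down.

servers = ["web-1", "web-2", "web-3"]
healthy = {"web-1": True, "web-2": False, "web-3": True}  # web-2 is down

rotation = cycle(servers)

def route_request():
    """Return the next healthy server in the rotation."""
    for _ in range(len(servers)):      # try each server at most once
        server = next(rotation)
        if healthy[server]:
            return server
    raise RuntimeError("no healthy servers available")

print([route_request() for _ in range(4)])
# → ['web-1', 'web-3', 'web-1', 'web-3']  (web-2 is skipped every pass)
```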

5. Predictability of bandwidth usage

Your bandwidth requirements naturally relate to the predictability of your data traffic. If you are going to consume a lot of bandwidth but predictability is low, you could choose a package for your dedicated server that includes a large data traffic allowance, or even unmetered billing. This is an easy way of knowing exactly how much you will be spending on the hosting of your dedicated server.

6. Network quality

As a customer, you can choose where a dedicated server is placed physically. It is important to consider the location of your end user. For instance, if your customers are in the APAC region, hosting in Europe might not be a sensible choice since data delivery will be slow. Data delivery also depends on the quality of the network of the hosting provider. To find out more about network quality, check a provider’s NOC (Network Operation Center) pages and test the network. Most hosting providers will allow you to do this.

7. Self-service and remote management

To what degree are you allowed to manage your server yourself? If you are running an application on a dedicated server, you probably have the technical skills and knowledge to maintain the server. But do you have access to a remote management module? Most A-brand servers are equipped with remote management modules, and providers can allow you secure access to that module.

A remote management module can also help if you are in a transition from IT on premise to a hosted solution (perhaps even a private cloud solution). It can be an in-between step that will leave existing work structures intact and ease the transition for IT personnel, since they will still be able to manage their own software deployments and the customized installation of an operating system.

8. Knowledge partner

And last but definitely not least: make sure your hosting provider involves its engineers and specialists in finding a solution tailored to your needs. A true knowledge partner advises on best practices and different solutions, which may involve combining different products into a hybrid solution.

The above will probably give you a good idea of what to consider before renting a dedicated server. If you are looking for specific advice or need assistance, please feel free to contact the LeaseWeb team. They can help you find the solution that is right for you.
