
The Markets and Markets Report

‘AI’ in Transportation Market to Grow to USD 10.30 Billion by 2030

The report ‘Artificial Intelligence in Transportation Market’ notes that the artificial intelligence in transportation market is projected to grow at a CAGR of 17.87%, with the market size expected to grow from USD 1.21 Billion in 2017 to USD 10.30 Billion by 2030. The emergence of autonomous trucks and industry-wide standards such as adaptive cruise control (ACC), blind spot alert, and advanced driver assistance systems (ADAS) is expected to trigger the growth of the artificial intelligence in transportation market. The growing demand for safety and security has created an opportunity for OEMs to develop new and innovative artificial intelligence systems that attract customers.

Software segment holds the largest share

The software segment is estimated to hold the largest share of the artificial intelligence in transportation market in 2017. In recent years, major developments have occurred in AI software solutions, platforms and related software development kits, driven by the need for advancement in object perception, machine translation and object recognition. Companies such as Alphabet (US), Microsoft (US), IBM (US) and Intel (US) are among the frontrunners in the development of AI software, and all of them are acquiring or investing heavily in start-ups to maintain a strong position in the market; in March 2017, Intel (US) acquired Mobileye (Israel) for approximately USD 15 billion. Software for deep learning technology holds a major share of the AI in transportation market, and the need for advancement in autonomous trucks has led to substantial funding for start-ups in the North American region.

The data mining process is estimated to be the fastest growing segment of the artificial intelligence in transportation market from 2017 to 2030. The huge amount of data collected from the sensors in semi-autonomous and autonomous trucks can be used to train the trucks to detect or recognize images, obstacles and the various scenarios one might encounter behind the wheel. Autonomous trucks have the potential to collect far more data from the outside environment, so data mining will be of great importance. North America is estimated to be the largest market for data mining, as the trucking industry has a huge influence on its economy. Similarly, the growing demand for predictive maintenance in the transportation industry is expected to boost the growth of the data mining process in the near future.

North America to be the leading market

Transportation plays a significant role in the North American economy. According to the American Trucking Associations, there are 8.7 million truck-related jobs in the US, and the country still faces a shortage of truck drivers. Hence, major developments related to autonomous driving of trucks are taking place in this country. Additionally, incentives and a high level of funding from the government play a major role in the development of this technology; for instance, in 2016 the US government committed USD 4.00 billion to accelerate the acceptance of autonomous vehicles on US roads. The Central North American Trade Corridor Association is also planning to develop a driverless truck corridor from Mexico to Manitoba.
Also, the region is home to leading technology companies such as Microsoft (US), Intel (US), and NVIDIA (US), which offer various AI technologies for vehicles in partnership with OEMs. The North American region therefore has great potential for the growth of the artificial intelligence in transportation market. The key software and Tier-I suppliers profiled in the report include Continental AG (Germany), Bosch (Germany), Microsoft (US), and NVIDIA (US); these systems and technologies are supplied to automotive OEMs such as Volvo (Sweden), Daimler (Germany), Scania (Sweden), and others.

Visual Analytics Market Worth USD 6.51 Billion by 2022

The report ‘Visual Analytics Market Global Forecast to 2022’ indicates that the visual analytics market is expected to grow from USD 2.57 Billion in 2017 to USD 6.51 Billion by 2022, at a Compound Annual Growth Rate (CAGR) of 20.4%. The major factors driving the visual analytics market are the rapid growth in the volume and variety of business data, the growing need for data-driven decision-making, and the increasing demand for advanced analytics. The market is growing rapidly because of the shift from traditional to advanced techniques for analyzing business data, and because of the massive surge in business data flowing through organizations.

IT business function

The IT business function is expected to witness the highest CAGR among business functions during the forecast period. IT departments across industries increasingly need access to real-time analytics on Key Performance Indicators (KPIs), such as visualizing the percentage of helpdesk ticket requests resolved within the agreed time-frame to assess the success of ongoing support.

Transportation and logistics

The transportation and logistics industry is expected to witness the highest CAGR among industries during the forecast period, because of the increasing need to manage the real-time data coming from various transportation and logistics activities, such as vehicle schedule analysis and route analysis.

North America to dominate

North America is expected to hold the largest share of the visual analytics market in 2017, due to the technological advancements and early adoption of analytical solutions in this region. The market in Asia Pacific (APAC) is expected to grow at the highest CAGR from 2017 to 2022, driven primarily by increasing technological adoption and huge opportunities across industries in the APAC countries, especially India, China and Japan. The report also covers the different strategies, such as mergers and acquisitions, partnerships and collaborations, and product upgrades, adopted by the major players to increase their market shares. Some of the major technology vendors include IBM (US), Oracle (US), SAP (Germany), SAS Institute (US), Tableau Software (US), Microsoft (US), MicroStrategy (US), TIBCO Software (US), Qlik (US), and Alteryx (US).
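Both forecasts are internally consistent. A quick compound-annual-growth-rate check (a minimal sketch, independent of either report's methodology) reproduces the quoted figures from the market sizes and forecast windows above:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two market-size estimates."""
    return (end_value / start_value) ** (1 / years) - 1

# AI in transportation: USD 1.21B (2017) to USD 10.30B (2030)
print(f"AI in transportation: {cagr(1.21, 10.30, 2030 - 2017):.2%}")  # ~17.9%

# Visual analytics: USD 2.57B (2017) to USD 6.51B (2022)
print(f"Visual analytics:     {cagr(2.57, 6.51, 2022 - 2017):.2%}")   # ~20.4%
```

The small gap between the computed ~17.9% and the published 17.87% reflects rounding in the quoted market sizes.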


Caution in Courtroom Evidence Presentation

Use of ‘Likelihood Ratio’ not consistently supported by scientific reasoning approach (Contributed by NIST)

Two experts at the National Institute of Standards and Technology (NIST) are calling into question a method of presenting evidence in courtrooms, arguing that it risks allowing personal preference to creep into expert testimony and potentially distorts evidence for a jury.

The method involves the use of the Likelihood Ratio (LR), a statistical tool that gives experts a shorthand way to communicate their assessment of how strongly forensic evidence, such as a fingerprint or DNA sample, can be tied to a suspect. In essence, LR allows a forensics expert to boil down a potentially complicated set of circumstances into a number, providing a pathway for experts to concisely express their conclusions based on a logical and coherent framework. LR’s proponents say it is appropriate for courtroom use; some even argue that it is the only appropriate method by which an expert should explain evidence to jurors or attorneys.

However, in a new paper published in the Journal of Research of the National Institute of Standards and Technology, statisticians Steve Lund and Hari Iyer caution that the justification for using LR in courtrooms is flawed. The justification is founded on a reasoning approach called Bayesian decision theory, which has long been used by the scientific community to create logic-based statements of probability. But Lund and Iyer argue that while Bayesian reasoning works well in personal decision making, it breaks down in situations where information must be conveyed from one person to another, such as in courtroom testimony.

These findings could contribute to the discussion among forensic scientists regarding LR, which is increasingly used in criminal courts in the U.S. and Europe. While the NIST authors stop short of stating that LR ought not to be employed at all, they caution that using it as a one-size-fits-all method for describing the weight of evidence risks conclusions being driven more by unsubstantiated assumptions than by actual data. They recommend using LR only in cases where a probability-based model is warranted. Last year’s report from the President’s Council of Advisors on Science and Technology (PCAST) mentions some of these situations, such as the evaluation of high-quality samples of DNA from a single source.

“We are not suggesting that LR should never be used in court, but its envisioned role as the default or exclusive way to transfer information is unjustified,” Lund said. “Bayesian theory does not support using an expert’s opinion, even when expressed numerically, as a universal weight of evidence. Among different ways of presenting information, it has not been shown that LR is most appropriate.”

Bayesian reasoning is a structured way of evaluating and re-evaluating a situation as new evidence comes up. If a child who rarely eats sweets says he did not eat the last piece of blueberry pie, his older sister might initially think it unlikely that he did; but if she spies a bit of blue stain on his shirt, she might adjust that likelihood upward. Applying a rigorous version of this approach to complex forensic evidence allows an expert to come up with a logic-based numerical LR that makes sense to the expert as an individual. The trouble arises when other people, such as jurors, are instructed to incorporate the expert’s LR into their own decision-making.
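In odds form, that updating step is simple arithmetic: posterior odds = prior odds × LR. A minimal sketch of the calculation a juror would be asked to perform; the prior odds value is an assumption for illustration, and the two LR values anticipate the experts' example discussed next:

```python
def posterior_probability(prior_odds: float, likelihood_ratio: float) -> float:
    """Bayesian updating: posterior odds = prior odds * LR,
    then convert the odds back into a probability."""
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

PRIOR_ODDS = 0.1  # assumed: the juror starts at 10-to-1 against the hypothesis

for lr in (50, 500):  # two experts' LRs for the same evidence
    p = posterior_probability(PRIOR_ODDS, lr)
    print(f"LR = {lr:3d} -> posterior probability {p:.1%}")
# LR =  50 -> posterior probability 83.3%
# LR = 500 -> posterior probability 98.0%
```

From the same starting point, two defensible LRs leave the juror at 83% versus 98% – the gap at the heart of Lund and Iyer's concern.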
An expert’s judgment often involves complicated statistical techniques that can give different LRs depending on which expert is making the judgment. As a result, one expert’s specific LR number can differ substantially from another’s. “Two people can employ Bayesian reasoning correctly and come up with two substantially different answers,” Lund said. “Which answer should you believe, if you’re a juror?”

In the blueberry pie example, imagine a jury had to rely on expert testimony to determine the probability that the stain came from a specific pie. Two different experts could be completely consistent with Bayesian theory, but one could testify to, say, an LR of 50 and another to an LR of 500, the difference stemming from their own statistical approaches and knowledge bases. Hearing 50 rather than 500 could lead jurors to a different ultimate decision.

Viewpoints differ on the appropriateness of using LR in court. Some of these differences stem from the view that jurors primarily need a tool to help them determine reasonable doubt, not particular degrees of certainty. To Christophe Champod, a professor of forensic science at the University of Lausanne, Switzerland, an argument over LR’s statistical purity overlooks what is most important to a jury. “We’re a bit presumptuous as expert witnesses that our testimony matters that much,” Champod said. “LR could perhaps be more statistically pure in the grand scheme, but it’s not the most significant factor; transparency is. What matters is telling the jury what the basis of our testimony is, where our data comes from, and why we judge it the way we do.”

The NIST authors, however, maintain that for a technique to be broadly applicable, it needs to be based on measurements that can be replicated. In this regard, LR often falls short, according to the authors. “Our success in forensic science depends on our ability to measure well. The anticipated use of LR in the courtroom treats it like it’s a universally observable quantity, no matter who measures it,” Lund said. “But it’s not a standardized measurement. By its own definition, there is no true LR that can be shared, and the differences between any two individual LRs may be substantial.”

The NIST authors do not state that LR is always problematic; it may be suitable in situations where LR assessments from any two people would differ inconsequentially. Their paper offers a framework for making such assessments, including examples for applying them. Ultimately, the authors contend it is important for experts to be open to other, more suitable science-based approaches rather than using LR indiscriminately. Because these other methods are still under…


Protecting IP Surveillance Cameras

(DITEK Surge Protection)

As intelligent, networked IP cameras become the industry standard, protecting them from damage and downtime has become a crucial part of any security system. IP cameras can deliver constant analytic information, such as the routes customers take through a store, in addition to providing the round-the-clock surveillance necessary for protecting your business. Should an electrical surge occur, unprotected cameras can easily be damaged and require maintenance or replacement, which is costly both in price and in downtime of critical systems. It is more necessary than ever to safeguard the consistent functionality of your IP cameras by installing surge protection.

Analog cameras require separate infrastructure – a standalone system often managed outside of an established business network. IP cameras, by contrast, communicate over your primary data network and can easily integrate with other systems. This provides faster, simpler connectivity and higher performance, but also comes with higher risk: a surge event at one of your IP cameras – often outdoors and exposed to weather – can be introduced into your entire network, damaging or even destroying other systems and devices. Surge protection for your cameras has therefore become essential risk prevention for your security system.

Downtime on your system is a huge security risk. Because modern IP cameras can cover larger areas and process more data, fewer cameras are required to achieve good coverage on a modern surveillance system. Downtime on any individual camera could therefore leave larger areas without coverage and present more of a threat to your system’s functionality. A surge event that takes down even a single IP camera could mean incalculable loss and risk to your business. It is more important than ever to ensure protection from surge events on every individual device on your network, and to prevent a catastrophic event, like a lightning strike, from damaging additional parts of your system.

IP cameras provide a wide range of functions beyond video surveillance to protect your business and deliver a higher return on investment (ROI). Today’s best-in-breed devices can track customer behaviors, perform facial recognition, and allow managers to watch sales and provide customer service in real time. With more functionality on individual cameras, losses are costlier if they are damaged in a surge event: should a camera go down, your business loses more than video data – you lose the real-time ability to optimize and grow. To prevent loss of important functionality, always have surge protection on your cameras and switches.

IP cameras are often the most exposed part of your surveillance system because many are installed outdoors, exposed to the weather. Although lightning strikes are not, contrary to popular belief, the most common instigator of surge events, outdoor IP cameras are often equipped with metal housings and are in much greater danger of conducting a surge into your network than other devices. Installers should consider the required protection when installing an outdoor IP camera.

Surge protectors are designed to be a cost-effective way to protect your IP cameras. Self-restoring devices remain functional after a surge event, and most are designed to survive several power surges. Should a surge protector be destroyed in a surge event, it is simpler and far less costly to replace than an expensive IP camera.
Most surges are not caused by lightning strikes; more often, they originate nearby or even within the facility itself, such as when a large HVAC system kicks on. Surge protection should be considered a simple preventative measure, necessary to protect your cameras – and, by extension, your entire business – from these surge events. Because a business incurs risk, liability, and costly downtime should its surveillance system go down, it is crucial to protect these devices from damaging surge events. Modern IP cameras that are networked to your system present a high risk if they lack proper surge protection. When adding IP cameras to your security system, surge protectors should be considered an essential accompanying feature to protect your business.


ONVIF Hosts Annual 2017 Member Webinar

ONVIF, the leading global standardization initiative for IP-based physical security products, has recently hosted its annual membership meeting, providing an update to members on the accomplishments of ONVIF in 2017 and its outlook for the year ahead. Attendees heard presentations on the final release of Profile A in 2017 and the ongoing development of Profile T for advanced video streaming, activities which are enabling the growing prevalence of the nearly 9,000 ONVIF conformant products in bid and specification processes in projects around the world.

In the webinar’s opening remarks, ONVIF Steering Committee Chairman Per Björkdahl highlighted the collective achievements of ONVIF since its founding in 2008, including the continued inclusion of ONVIF specifications in international standards such as the International Electrotechnical Commission’s IEC TC 79 standards for video surveillance and physical access control. Björkdahl also cited the progress of ONVIF on a system-to-system approach for interoperability between various management systems, work which is happening in tandem with the continued development of additional profiles to keep pace with the changes of the industry.

Björkdahl also recognized the extensive contributions of three individuals to various ONVIF technical committees. Arsenio Vilallonga, vice president, Networked Systems, FLIR Systems, Inc., and Nicholas Brochu of Genetec, one of the most active participants in the development of Profile T, were recipients of the 2017 ONVIF Award, which recognizes individuals and companies who have made significant contributions to ONVIF. Sriram Prasad Bhetanabottla of Canon Inc. was recognized with the 2017 ONVIF Distinguished Service Award for his role as an active, contributing core member of the Technical Services Working Group’s Device Testing and Profile T activities.

Hans Busch of Bosch, Chairman of the ONVIF Technical Committee, spoke to members about the specification development roadmap and its alignment with the standardization activities within the IEC TC 79 working groups for video surveillance and physical access control standards. Technical Services Committee (TSC) Chair Andreas Schneider of Sony gave an overview of the committee’s work on profiles, test tools, the conformance process and Developers’ Plugfests, and highlighted the new conformant product database launched earlier this year. ONVIF Communication Committee Chair Jonathan Lewit of Pelco by Schneider Electric followed the TSC with a recap of the internal and external communications of ONVIF in 2017, activities designed to promote the adoption of ONVIF Profiles.


NICE Actimize Debuts Autonomous Financial Crime Management

To significantly reduce costs of compliance and increase detection accuracy, Autonomous Financial Crime Management addresses such complexities as changing regulatory requirements, mounting costs for personnel, and dramatic shifts in technology.

In the face of mounting pressures on compliance departments at financial services organizations, NICE Actimize, a NICE business, is leading the move to more efficient and cost-effective financial crime and compliance operations with the debut of Autonomous Financial Crime Management. Addressing a paradigm shift in which machine-led functions drive operations that are today performed manually, this innovative approach creates an environment that more effectively addresses the challenges and pain points financial services organizations face, allowing them to tailor their operations to lower costs and drive greater profitability, all while improving accuracy and throughput. Autonomous Financial Crime Management also allows organizations to configure which decisions are directed to human experts, supporting anything from semi-autonomous to fully autonomous operations.

By addressing such complexities as changing regulatory requirements, mounting costs for personnel, and dramatic shifts in technology, NICE Actimize’s Autonomous Financial Crime Management offers a unified view of risk through the targeted use of big data, pervasive advanced analytics, artificial intelligence and Robotic Process Automation, enabling these issues to be handled more effectively. It also streamlines operations and makes more productive use of personnel, thereby improving an organization’s overall effectiveness.

NICE Actimize’s Autonomous Financial Crime Management approach relies on its deep domain knowledge and expertise in financial crime and compliance. The new method creates a seamless connection to data from anywhere, from any source, at any volume, and works quickly to turn raw data into intelligence. This intelligence is then used to detect, decide, investigate and resolve alerts and cases with limited human intervention, enabling financial services organizations to mitigate various types of financial crime with greater speed and accuracy.

“Financial services organizations are facing a true paradigm shift,” said Joe Friscia, President, NICE Actimize. “Where humans once drove and assisted machines to execute processes in financial crime management, the reverse is becoming true: machines are now driving operations, thereby creating dramatic gains in cost savings, vastly better models and improved detection accuracy. This rapid technology transformation, coupled with vast regulatory change, requires a new approach. NICE Actimize’s Autonomous Financial Crime Management leads the market and our customers into the future while providing a vision and strategy that streamlines the specialized operational requirements of financial crime fighting by unifying advanced analytics, machine learning and intelligent automation.”


Skybox Security’s Threat-Centric Vulnerability Management for Virtual and Cloud Networks

Expands solutions to enable organizations to consistently and securely manage day-to-day security processes across all networks in one platform

Skybox™ Security continues to expand its cloud security management solution, Skybox for the Cloud™. The solution now includes threat-centric vulnerability management (TCVM) for virtual and multi-cloud environments and extends capabilities for security policy management, attack surface visibility and network path analysis. With one platform, the Skybox™ Security Suite, organizations are now able to consistently and securely manage day-to-day security processes across their entire network infrastructure, whether on premises or in the cloud.

As businesses continue to migrate to virtual and cloud environments, security becomes more complicated due to the nature of cloud architecture, from multi-tenancy to elasticity and the shared responsibility for the computing stack. For example, security mechanisms in virtual and cloud networks differ from those in physical environments, and even differ among cloud service providers. In addition, cloud elasticity means virtual machines are quickly spun up and down, making traditional vulnerability scanning insufficient, as the environment may change significantly between scans. To counteract these challenges and reduce the chance of human error, Skybox helps automate security processes not only in virtual and cloud environments, but across all networks within a single, unified dashboard. Whether an organization’s network infrastructure is physical, virtual, cloud or a hybrid of all three, Skybox for the Cloud addresses use cases in the following areas:

- Comprehensive visibility of the attack surface in a single network model, capable of incorporating data from 120+ security and networking technologies.
- End-to-end path analysis from any source to any destination across or within physical, virtual and multi-cloud networks, including detailed analysis of the devices, rules, etc., along the path.
- Unified security policy management across all networks, including out-of-the-box compliance checks for key industry regulations such as NIST and PCI DSS.
- Vulnerability discovery, prioritization and remediation planning in the context of an organization’s on-prem and multi-cloud networks, correlated with current threat intelligence on exploits in the wild.

As cyber events like the Equifax breach (caused by the Apache Struts vulnerability) continue to increase, it is obvious that organizations are struggling to quickly identify and effectively remediate vulnerabilities in their systems. This challenge can be compounded by the nature of cloud environments and even by procedural requirements from service providers that affect third-party scans. Skybox for the Cloud gives security teams the power to assess vulnerabilities in the cloud on demand by combining data from cloud-based patch and asset management systems, scanners and network devices. The results are analyzed and prioritized using the TCVM approach, taking into account (a simplified scoring sketch follows at the end of this section):

- The vulnerabilities on the virtual machine and its importance to the organization.
- The virtual machine’s exposure, based on the hybrid network topology and the security controls in place.
- Threat intelligence on available and active exploits in the wild.

TCVM also gives prescriptive guidance on what action can be taken to prevent exploitation and how urgently that action should be performed.

“IT teams are tasked with launching new services and applications on a daily basis. By leveraging cloud architecture, they can achieve that in minutes, as opposed to days of work. The challenge is that this leads to a fluid security situation where assets (virtual machines) can be assigned to the wrong security group, resulting in immediate exposure. Security teams need to be on top of this, and the only way to do it is to have global visibility and management across all your networks,” said Ravid Circus, Skybox VP of Products.
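The three TCVM factors map naturally onto a scoring function. The following is a minimal sketch of that style of prioritization; the Vulnerability fields, weightings and tcvm_score function are illustrative assumptions, not Skybox's actual model:

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve_id: str
    severity: float          # e.g., CVSS base score, 0-10
    asset_importance: float  # 0-1, business value of the virtual machine
    exposed: bool            # reachable from untrusted networks per topology analysis
    exploit_in_wild: bool    # threat intelligence: active exploitation observed

def tcvm_score(v: Vulnerability) -> float:
    """Illustrative threat-centric score: exposure and active exploits
    outweigh raw severity, mirroring the three factors listed above."""
    score = v.severity * v.asset_importance
    if v.exposed:
        score *= 2.0  # assumed weighting
    if v.exploit_in_wild:
        score *= 3.0  # assumed weighting
    return score

vulns = [
    Vulnerability("CVE-2017-5638", 10.0, 0.9, True, True),   # Apache Struts (Equifax)
    Vulnerability("CVE-EXAMPLE-1", 9.8, 0.3, False, False),  # hypothetical: severe but unexposed
]
for v in sorted(vulns, key=tcvm_score, reverse=True):
    print(f"{v.cve_id}: priority {tcvm_score(v):.1f}")
```

The point of the ordering: a severe but unexposed flaw on a low-value machine ranks far below an actively exploited, exposed one, which is what distinguishes threat-centric prioritization from sorting by CVSS alone.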


Quantum StorNext 6

New StorNext 6 delivers unparalleled combination of high performance and advanced data management

Quantum Corp. announced StorNext 6, a major new release of its award-winning StorNext® file system. StorNext 6 adds new advanced data management features to the industry-leading streaming performance of the company’s scale-out tiered storage solutions portfolio, which includes flash, disk, tape, and public and private cloud storage offerings. With this unique combination of performance and management, Quantum continues to extend the power of StorNext not only in media and entertainment but also in other highly data-intensive environments such as genomics, academic research, video surveillance, oil and gas, and government security.

StorNext 6 addresses the needs of enterprises grappling with using their existing NAS solutions to store large, rapidly growing data sets and drive business value from that data. These enterprises are quickly realizing that traditional NAS is too difficult to upgrade, cannot deliver sufficient performance and is unable to handle parallel workloads. StorNext 6 not only overcomes these limitations but also provides more efficient and cost-effective ways to share and access files across geographically distributed teams, to manage and protect archived data, and to audit changes to data throughout its lifecycle. Quantum delivers all these benefits in an integrated, multi-tier storage solution that maintains optimized performance levels, visibility and access at every tier.

Multi-site file replication and sharing for easier collaboration

In today’s world, the ability of geographically dispersed teams to work together using a common data set for creative, analytical or other purposes can be a key competitive differentiator. StorNext 6 facilitates greater collaboration with the introduction of FlexSync™ and FlexSpace™.

FlexSync is a powerful new capability in StorNext 6 that provides a fast, flexible and simple way to synchronize data between multiple StorNext systems in a highly manageable and automated fashion. It supports one-to-one, one-to-many and many-to-one file replication scenarios and can be configured to operate at almost any level – specific files, specific folders or entire file systems. By leveraging enhancements in file system metadata monitoring, FlexSync recognizes changes instantly and can immediately begin reflecting those changes on another system. This approach avoids the need to lock the file systems to identify changes, reducing synchronization time from hours or days to minutes, or even seconds. Users can also set policies that automatically trigger copies of files so that they are available at multiple sites, enabling different teams to access data quickly and easily whenever it is needed. In addition, by providing automatic replication across sites, FlexSync offers increased data protection.

FlexSpace allows multiple instances of StorNext – and geographically distributed teams located anywhere in the world – to share a single archive repository and easily access the same data set. Users at different sites can store files in the shared archive, as well as browse and pull data from the repository. Shared archive options include public cloud storage such as Amazon Web Services (AWS), Microsoft Azure or Google Cloud via StorNext’s existing FlexTier™ capability, and private cloud storage based on Quantum’s Lattus® object storage or, through FlexTier, third-party object storage such as NetApp StorageGRID, IBM Cleversafe and Scality RING.
Also, as with FlexSync, FlexSpace provides fully automated movement of files according to policies, thereby increasing efficiency and enhancing data protection through off-site storage.

Enhanced StorNext Client functionality for greater agility

With StorNext 6, Quantum delivers a new quality of service (QoS) feature that empowers users to tune and optimize performance across all client workstations on a machine-by-machine basis. By using QoS to specify the appropriate bandwidth allocation for individual workstations, an administrator can ensure that storage resources are optimized according to business priorities, e.g., providing more bandwidth to demanding applications on urgent projects, and can adjust quickly as priorities change. StorNext 6 also enables client platforms to browse archive directories that contain offline files – which can number in the hundreds of thousands, or even millions – without having to retrieve the entire directory. This significantly streamlines the archive retrieval process, enabling users to get the files they need more quickly.

File copy expiration for increased ROI and file auditing for deeper insights

As the foundation for a single, integrated solution that can span a single namespace across flash, spinning disk, object storage, tape and the cloud, StorNext has long reduced overall storage costs through tiering. StorNext 6 adds a new copy expiration feature, enabling automated removal of file copies from more expensive storage tiers, thereby freeing up space and increasing the overall return on investment. In addition, when one of several copies of a file is removed from storage, a complementary selectable retrieve function in StorNext 6 allows users to dictate the order of retrieval of the remaining copies. This functionality ensures that the file is retrieved from the most appropriate storage tier based on business or organizational needs. StorNext 6 can also efficiently track changes in files across the data lifecycle and provide reports on who changed a file, when the changes were made, what was changed, and whether and to where a file was moved. In addition to providing administrators with greater file management granularity and insight into usage patterns, this new auditing feature helps support compliance requirements.
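The design point behind FlexSync – reacting to file system metadata change events instead of locking and scanning whole file systems – can be illustrated with a generic sketch. This is a conceptual illustration under assumed names and a stand-in event source, not StorNext's actual implementation:

```python
import shutil
from pathlib import Path
from queue import Queue, Empty

# Conceptual sketch only: a metadata-change journal feeds a sync worker, so
# work is proportional to the number of changes rather than the size of the
# file system. A scan-based approach would walk and compare both trees on
# every pass, which is the slow path that event-driven replication avoids.

change_journal = Queue()  # assumed event source, e.g., fed by a file system
                          # notification or metadata-journal API

def drain_and_sync(source_root: Path, replica_root: Path) -> int:
    """Replicate every file whose change event is queued; return the count."""
    synced = 0
    while True:
        try:
            changed: Path = change_journal.get_nowait()
        except Empty:
            return synced
        target = replica_root / changed.relative_to(source_root)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(changed, target)  # copies file data plus metadata
        synced += 1
```

Because only changed files are touched, synchronization latency stays near real time even for directories holding millions of entries, which is the gain the FlexSync description above claims over lock-and-scan synchronization.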
