
Everything about Artificial Intelligence



  1. Artificial Intelligence Malicious Use System. By: Vaibhav Mishra, Kriti Labhaniya. Truba Institute of Engineering and Information Technology, Bhopal, MP.
  2. Contents: Definitions, Capabilities, The Problems, Solutions, Some More Solutions, Results.
  3. What is Artificial Intelligence? Artificial Intelligence is the study of mental faculties through the use of computational models. Artificial Intelligence is a way of making a computer, a computer-controlled robot, or software think intelligently, in a manner similar to how intelligent humans think. Artificial Intelligence is an inviting field for dreamers to explore, and it evolved from the idea that building a human-like machine is possible.
  4. Artificial Intelligence Capabilities
  5. Why is AI Important? • AI automates repetitive learning and discovery through data. • AI adds intelligence to existing products. • AI analyzes more and deeper data using neural networks that have many hidden layers. • AI achieves incredible accuracy through deep neural networks, which was previously impossible.
  6. How does AI work? AI works by combining large amounts of data with fast, iterative processing and intelligent algorithms, allowing the software to learn automatically from patterns or features in the data. AI is a broad field of study that includes many theories, methods and technologies, as well as the following major subfields: Machine Learning, Neural Networks, Deep Learning, Cognitive Computing, Computer Vision, Natural Language Processing (NLP).
  7. Machine Learning. Machine Learning automates analytical model building. It uses methods from neural networks, statistics, operations research and physics to find hidden insights in data without being explicitly programmed for where to look or what to conclude.
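The core idea of this slide, recovering a pattern from data without being told the rule, can be sketched in a few lines of plain Python (a toy illustration, not from the slides; the function name `fit_line` is invented here):

```python
# Fit y = w*x + b to example points by ordinary least squares.
# No hand-coded rule for the answer: w and b come purely from the data.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - w * mean_x
    return w, b

# These points were generated by the hidden rule y = 2x + 1;
# the model recovers that rule from the data alone.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]
w, b = fit_line(xs, ys)
print(w, b)  # 2.0 1.0
```

The same principle, estimating model parameters from examples rather than programming them, underlies the more elaborate methods the slide names.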
  8. Neural Networks. In information technology (IT), a neural network is a system of hardware and/or software patterned after the operation of neurons in the human brain. Neural networks, also called artificial neural networks, are a variety of deep learning technology, which also falls under the umbrella of artificial intelligence, or AI.
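As a rough sketch of what "patterned after the operation of neurons" means, here is a single artificial neuron in Python (a hypothetical minimal example; real networks connect many such units):

```python
import math

def neuron(inputs, weights, bias):
    # An artificial neuron: weighted sum of inputs plus a bias,
    # passed through a sigmoid activation that squashes the result into (0, 1),
    # loosely mimicking a biological neuron's firing rate.
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))

out = neuron([1.0, 0.5], [0.4, -0.2], 0.1)
print(round(out, 3))  # ≈ 0.599
```

Training a network means adjusting the weights and biases of many such neurons until the outputs match the data.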
  9. Deep Learning. Deep learning (also known as deep structured learning or hierarchical learning) is part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms. Learning can be supervised, semi-supervised or unsupervised.
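One way to picture "hierarchical learning of data representations" is a stack of layers, where each layer builds on the previous layer's output rather than the raw input (a toy sketch with made-up weights, not a trained model):

```python
def relu(x):
    # Rectified linear unit: a common activation function in deep networks.
    return max(0.0, x)

def layer(inputs, weights, biases):
    # One dense layer: every output is a weighted sum of all inputs plus a bias.
    return [relu(sum(i * w for i, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

x = [1.0, 2.0]                                       # raw input
h = layer(x, [[0.5, -0.3], [0.2, 0.8]], [0.0, 0.1])  # first-level representation
y = layer(h, [[-1.0, 1.0]], [0.0])                   # built on top of h, not x
print(h, y)
```

"Deep" simply means many such layers, so later layers can represent increasingly abstract features of the input.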
  10. Cognitive Computing. In general, the term cognitive computing has been used to refer to new hardware and/or software that mimics the functioning of the human brain and helps to improve human decision-making. In this sense, CC is a new type of computing with the goal of more accurate models of how the human brain/mind senses, reasons, and responds to stimulus. CC applications link data analysis and adaptive page displays (AUI) to adjust content for a particular type of audience. As such, CC hardware and applications strive to be more affective and more influential by design.
  11. Computer Vision. Computer vision is an interdisciplinary field that deals with how computers can be made to gain high-level understanding from digital images or videos. From the perspective of engineering, it seeks to automate tasks that the human visual system can do.
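A minimal taste of extracting "high-level understanding" from pixels: detecting a vertical edge in a tiny hard-coded grayscale image by thresholding brightness differences between neighboring pixels (an invented toy example, far simpler than real vision systems):

```python
# A 3x4 grayscale "image": 0 = dark, 9 = bright.
# The left half is dark and the right half is bright.
image = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]

def vertical_edges(img, threshold=5):
    # Mark 1 wherever the brightness jump between horizontal
    # neighbors exceeds the threshold, 0 elsewhere.
    return [[1 if abs(row[x + 1] - row[x]) > threshold else 0
             for x in range(len(row) - 1)]
            for row in img]

print(vertical_edges(image))  # each row marks the dark/bright boundary
```

Real computer-vision pipelines apply far richer filters and learned features, but the step from raw pixel values to a structured fact ("there is an edge here") is the same in spirit.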
  12. Natural Language Processing. Natural language processing (NLP) is an area of computer science and artificial intelligence concerned with the interactions between computers and human (natural) languages, in particular how to program computers to process and analyze large amounts of natural language data.
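A minimal example of "processing and analyzing natural language data": tokenizing text and counting word frequencies with Python's standard library (a toy sketch; real NLP systems go far beyond counting):

```python
import re
from collections import Counter

def word_counts(text):
    # Normalize to lowercase and split into word tokens,
    # then count how often each word appears.
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(tokens)

counts = word_counts("AI helps computers process language. Language data grows.")
print(counts["language"])  # 2 — the word appears twice despite different casing
```

Tokenization and frequency counts like these are the usual first step before the heavier statistical and neural methods NLP is known for.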
  13. The Problems: Cybersecurity and Vulnerability. Cybersecurity, computer security or IT security is the protection of computer systems from theft of and damage to their hardware, software or information, as well as from disruption or misdirection of the services they provide. Vulnerability is a cybersecurity term that refers to a flaw in a system that can leave it open to attack. A vulnerability may also refer to any type of weakness in a computer system itself, in a set of procedures, or in anything that leaves information security exposed to a threat.
  14. Solutions • Policymakers should work closely with researchers to investigate, prevent and mitigate the risks of malicious uses of AI. • Researchers and engineers in artificial intelligence should take the dual-use nature of their work seriously, allowing misuse considerations to influence research priorities and standards, and proactively reaching out to relevant parties when harmful applications are foreseeable.
  15. Solutions • Best practices should be identified in research areas with more mature methods for addressing dual-use problems, such as computer security, and applied where relevant to the case of AI. • Actively seek to expand the range of stakeholders and domain experts involved in discussions on these challenges.
  16. Other Solutions • Learn from and with the cybersecurity community. • Explore different openness models (given the dual-use nature of AI and ML). • Promote a culture of responsibility. • Develop technological and policy solutions.
  17. Results: Data Protection. Data protection is the process of safeguarding important information from corruption, compromise or loss. The importance of data protection increases as the amount of data created and stored continues to grow at unprecedented rates.
  18. Results: Vulnerability Can Be Stopped. Vulnerability is the state of being exposed to the possibility of a machine being attacked or harmed. Recently, a processor vulnerability affected Intel, AMD and ARM CPUs, and most processors come from these companies. So, if such a vulnerability occurs again, it could become a global technical crisis.
  19. Thank you for your patience.

Editor's Notes

  • Computer vision is an interdisciplinary field that deals with how computers can be made to gain high-level understanding from digital images or videos. From the perspective of engineering, it seeks to automate tasks that the human visual system can do. "Computer vision is concerned with the automatic extraction, analysis and understanding of useful information from a single image or a sequence of images. It involves the development of a theoretical and algorithmic basis to achieve automatic visual understanding." As a scientific discipline, computer vision is concerned with the theory behind artificial systems that extract information from images. The image data can take many forms, such as video sequences, views from multiple cameras, or multi-dimensional data from a medical scanner. As a technological discipline, computer vision seeks to apply its theories and models to the construction of computer vision systems.
