Call for Abstracts
The 8th Global Summit on Artificial Intelligence and Neural Networks will be organized around the theme “Harnessing the Power of Artificial Intelligence”.
Neural Networks 2020 comprises keynote and speaker sessions on the latest cutting-edge research, designed to offer comprehensive global discussions that address current issues in the field.
Submit your abstract to any of the tracks listed below.
Register now for the conference by choosing a package suitable to you.
Artificial intelligence gives a robot creative ways to assist humans in changing and dynamic environments, such as homes, hospitals, the workplace, and all around us. It is the process of making a computer-operated robot or a piece of software think logically, in a manner comparable to the way creative humans think. Artificial intelligence is developed by analysing how the human brain conceives ideas and how humans learn, decide, and work while solving a problem, and then using the results of this study as a basis for developing intelligent software and systems. In the real world, however, knowledge has some unwelcome properties.
AI algorithms can tackle learning, perception, problem-solving, language understanding, and/or logical reasoning. In the modern world, AI is used in many ways, including to control robots. Sensors, actuators, and non-AI programming are all parts of a larger robotic system.
- Track 1-1 Artificial Narrow Intelligence
- Track 1-2 Robotics
- Track 1-3 Computer Science and Technology
- Track 1-4 Information Technology
- Track 1-5 Recurrent Neural Networks and Reservoir Computing
- Track 1-6 Artificial Super Intelligence
Cognitive computing refers to the hardware and/or software that helps to improve human decision-making and mimics the functioning of the human brain. It refers to systems that can learn at scale, reason with purpose, and interact with humans naturally. It comprises software libraries and machine learning algorithms for extracting information and knowledge from unstructured data sources. The main goal is to build accurate models of how the human brain/mind senses, reasons, and responds to stimuli. The underlying high-performance computing infrastructure is powered by processors such as multicore CPUs, GPUs, TPUs, and neuromorphic chips. Such systems interact easily with users and with mobile and cloud computing services, so that users can define their needs comfortably.
Neural Informatics for Cognitive Computing
Neural information theory is a multidisciplinary inquiry into the physiological and biological representation of knowledge and information in the brain at the neuron level.
- Track 2-1 AI and Signal Processing
- Track 2-2 Machine Learning Algorithms
- Track 2-3 Big Data and Cognitive Computing
- Track 2-4 Speech Recognition & Face Detection
- Track 2-5 Cognitive Assistant for the Visually Impaired
- Track 2-6 Natural Language Interaction
Machine learning is a branch of artificial intelligence based on the idea that systems can learn from data, identify patterns, and make decisions with minimal human intervention. Machine learning is a method for making a personal computer, a PC-controlled robot, or a piece of software think smartly, in a manner comparable to the way perceptive people think. Algorithms are normally grouped either by learning style or by similarity in form or function. Machine learning enables continuous improvement of decision-making through exposure to new scenarios, testing, and adaptation, while employing pattern and trend detection to improve decisions in subsequent situations. ML offers possible solutions in all of these areas and is set to be a pillar of our future progress.
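As a concrete illustration of a system that learns from data, the sketch below implements a minimal k-nearest-neighbour classifier in pure Python: it classifies a new point by majority vote among the k closest labelled examples. The data points and labels are made up for illustration.

```python
import math
from collections import Counter

def knn_predict(examples, query, k=3):
    """Classify `query` by majority vote among its k nearest labelled examples.

    `examples` is a list of (point, label) pairs; each point is a tuple of numbers.
    """
    nearest = sorted(examples, key=lambda pair: math.dist(pair[0], query))[:k]
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

# Two well-separated clusters: the classifier "learns" purely from the data.
examples = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
            ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
prediction = knn_predict(examples, (0.5, 0.5))
```

There is no explicit rule for either class anywhere in the code; the decision boundary is implied entirely by the labelled examples, which is the sense in which the system learns from data.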
- Track 3-1 Deep Learning
- Track 3-2 Natural Language Processing
- Track 3-3 Artificial Intelligence
An artificial neural network (ANN) is a computational model based on the structure and functions of biological neural networks. ANNs are considered nonlinear statistical data-modelling tools in which complex relationships between inputs and outputs are modelled or patterns are found.
Modern digital computers outperform humans in the domain of numeric computation and related symbol manipulation. However, humans can effortlessly solve complex perceptual problems (like recognizing a man in a crowd from a mere glimpse of his face) at a speed and to an extent that dwarf the world’s fastest computer. The architecture of the biological neural system is completely different, and this difference significantly affects the type of functions each computational model can best perform.
Numerous efforts to develop intelligent programs based on von Neumann’s centralized architecture have not resulted in general-purpose intelligent programs. Inspired by biological neural networks, ANNs are massively parallel computing systems consisting of an extremely large number of simple processors with many interconnections. ANN models attempt to use some of the organizational principles believed to be at work in the human brain.
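The simple processors that make up such networks can be sketched as a single artificial neuron: a weighted sum of inputs passed through a nonlinear activation. The sigmoid activation and the weights below are illustrative choices, not tied to any particular model.

```python
import math

def neuron(inputs, weights, bias):
    # weighted sum of the inputs, then a nonlinear (sigmoid) activation
    z = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-z))

# With zero net input the sigmoid sits at its midpoint, 0.5;
# positive net input pushes the output toward 1.
mid = neuron([0.0, 0.0], [1.0, 1.0], 0.0)
high = neuron([1.0, 1.0], [1.0, 1.0], 0.0)
```

An ANN is many of these units wired together, so the interesting behaviour comes from the interconnections rather than from any single processor.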
- Track 4-1 Speech Recognition
- Track 4-2 Artificial Neurons
- Track 4-3 Machine Translation
- Track 4-4 Advances in Artificial Neural Systems
- Track 4-5 Optimization for training deep neural network models
- Track 4-6 Autoencoders
- Track 4-7 Deep Feedforward Networks
- Track 4-8 Adaptive Neuro-Fuzzy Inference System
- Track 4-9 ANN Controller for automatic ship berthing
Ambient intelligence (AmI) deals with computing devices embedded in physical environments that interact intelligently and unobtrusively with people. These environments should be aware of people's needs, adapt to their requirements, and forecast their behaviours. They can be diverse: homes, meeting rooms, offices, hospitals, schools, control centres, vehicles, and so on. Artificial intelligence research aims to bring more intelligence into AmI environments, allowing better support for humans and access to the essential knowledge for making better decisions when interacting with these environments.
- Track 5-1 Pattern Recognition
- Track 5-2 Facial Recognition
- Track 5-3 Multilayer Perceptrons and Kernel Networks
- Track 5-4 Probability matching in Perceptrons
- Track 5-5 Smart homes and media convergence
- Track 5-6 Interactive machine learning
- Track 5-7 Context-driven processing and inference
- Track 5-8 Data Fusion between Physical and Digital Worlds
The perceptron is a machine learning algorithm that provides classified outcomes for computing. It is a kind of single-layer artificial network with only one neuron, and a classification algorithm that makes its predictions based on a linear predictor function combining a set of weights with the feature vector.
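The linear predictor just described can be sketched in a few lines of Python. The perceptron learning rule below is trained on the AND function as an illustrative, linearly separable example; the learning rate and epoch count are arbitrary choices.

```python
def predict(weights, bias, x):
    # linear predictor: fire (1) when the weighted sum crosses the threshold
    s = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if s >= 0 else 0

def train(data, n_inputs, epochs=10, lr=0.1):
    # classic perceptron learning rule: nudge each weight in proportion
    # to the prediction error (target - prediction) and the input
    weights, bias = [0.0] * n_inputs, 0.0
    for _ in range(epochs):
        for x, target in data:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# Linearly separable training set: the AND function.
AND = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias = train(AND, n_inputs=2)
```

Because AND is linearly separable, the rule converges to weights that classify all four inputs correctly; for a non-separable function such as XOR, a single perceptron cannot converge, which motivates the multilayer networks discussed next.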
The multilayer perceptron (MLP) is a class of feedforward artificial neural networks. Layered feedforward networks are trained using the static back-propagation training algorithm. Designing and training an MLP involves several issues:
- Track 6-1 Selecting the number of hidden layers to use in the neural network
- Track 6-2 Globally searching for a solution that avoids local minima
- Track 6-3 Validating the neural network to test for overfitting
- Track 6-4 Converging to an optimal solution in a reasonable period of time
- Track 6-5 Back-propagation algorithm for the online training of Multilayer Perceptrons
- Track 6-6 Comparison to Probability
- Track 6-7 Compensatory Fuzzy Logic
- Track 6-8 Multilayer Perceptron Neural Network for flow prediction
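One reason the hidden-layer choices above matter is that a hidden layer lets an MLP represent functions no single perceptron can, such as XOR. The sketch below hard-codes (rather than trains) the weights of a tiny 2-2-1 network with step activations purely to show the layered structure; the particular weights are illustrative.

```python
def step(z):
    # threshold activation: fire when the weighted sum is non-negative
    return 1 if z >= 0 else 0

def xor_mlp(x1, x2):
    # hidden layer: one unit computes OR, the other computes AND
    h_or = step(x1 + x2 - 0.5)
    h_and = step(x1 + x2 - 1.5)
    # output layer: OR and not-AND, i.e. XOR --
    # a function no single-layer perceptron can realize
    return step(h_or - h_and - 0.5)

outputs = [xor_mlp(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]]
```

In practice these weights would be found by back-propagation rather than set by hand, which is exactly where the issues listed above (local minima, overfitting, convergence time) arise.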
Mechatronics is the combination, or junction, of electrical, mechanical, and computer science engineering. Mechatronics is the field closest to robotics, with one slight but important difference: in mechatronic systems the inputs are provided, whereas robotic systems acquire their inputs on their own.
A mechanical engineer is mostly responsible for the mechanical body parts, an electrical/computer engineer for the electrical aspects, and a computer engineer for the programming. A mechatronics engineer must be well qualified in every one of these areas.
Robotics is a very broad term; we must look at different applications to understand it better.
- Track 7-1 Micro Robots
- Track 7-2 Robots in Defense
- Track 7-3 Military Robots
- Track 7-4 Automation and Manufacturing
- Track 7-5 Industrial Robot Automation
- Track 7-6 Medical Robots
- Track 8-1 Advanced NLG Systems
- Track 8-2 NLP for Understanding Semantic Analysis
- Track 8-3 Approaches of NLP for Linguistic Analysis
- Track 8-4 Natural Language Understanding and Interpretation
- Track 8-5 NLP for Advanced Text Analysis
Cloud computing is a branch of information technology that grants universal access to shared pools of virtualised computing resources. A cloud can host different workloads, allows workloads to be scaled and deployed on demand through rapid provisioning of physical or virtual machines, is self-recovering, supports redundant and highly scalable programming models, and allows workloads to recover from hardware/software failures and rebalance allocations.
Artificial intelligence plays a very important role in making resources available and in providing distribution transparency, openness, and scalability, especially for cloud computing applications. Working together, artificial intelligence and cloud computing will have an important impact on the development of information technology.
- Track 9-1 Secure data management within and across data centers
- Track 9-2 Software and data segregation security
- Track 9-3 Integrity assurance for data outsourcing
- Track 9-4 Cloud Cryptography
- Track 9-5 Cloud access control and key management
Autonomous robots are intelligent machines that can perform tasks under the control of a computer program. They are independent of any human controller and can act on their own. The basic idea is to program the robot to respond in a certain way to outside stimuli. The combined study of neuroscience, robotics, and artificial intelligence is called neurorobotics.
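The idea of programming a robot to respond in a certain way to outside stimuli can be sketched as a minimal sense-act rule. The sensor reading, threshold, and action names below are hypothetical, chosen only to illustrate the control pattern.

```python
def reactive_step(distance_to_obstacle, threshold=0.5):
    # sense-act rule: turn away when an obstacle is closer than the
    # threshold, otherwise keep driving forward -- no human in the loop
    return "turn" if distance_to_obstacle < threshold else "forward"

# Simulated sensor readings as the robot approaches a wall.
actions = [reactive_step(d) for d in (2.0, 1.0, 0.4)]
```

Real autonomous robots layer many such rules (or learned policies) over noisy sensors, but the stimulus-to-response mapping is the same basic shape.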
- Track 10-1 Formation control for Autonomous Robots
- Track 10-2 Robotics and Computer-Integrated Manufacturing
- Track 10-3 Hybridization of Swarm Intelligence techniques, with applications to robotics or autonomous complex systems
- Track 10-4 Advances in Autonomous Mini Robots
- Track 10-5 Soft Computing for Robotics
Parallel processing reduces processing time by breaking program tasks up and running them simultaneously on multiple microprocessors. With more engines (CPUs) running, the program runs faster. It is particularly useful for programs that perform complex computations, and it provides a viable option in the quest for cheaper computing alternatives. Supercomputers commonly have hundreds of thousands of microprocessors for this purpose. Parallel programming is an evolution of serial computing in which a job is broken into discrete parts that can be executed concurrently; each part is further broken down into a series of instructions, and the instructions from each part execute simultaneously on different CPUs.
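The decomposition just described, a job broken into discrete parts executed concurrently, can be sketched with Python's standard concurrent.futures module. Threads are used here for portability of the example; for CPU-bound Python work a ProcessPoolExecutor would be the usual choice.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # one discrete part of the job: sum the squares of this chunk
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # break the job into roughly `workers` equal parts...
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # ...run the parts concurrently, then combine the partial results
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

total = parallel_sum_of_squares(list(range(1000)))
```

The result is identical to the serial computation; only the schedule changes, which is why this decompose-execute-combine pattern scales from a few threads up to supercomputer-sized processor counts.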
- Track 11-1 Supercomputing / High-Performance Computing (HPC)
- Track 11-2 General-purpose computing on graphics processing units
- Track 11-3 Re-optimizing Data-Parallel Computing
- Track 11-4 Adaptive Sequential Posterior Simulators
The ethics of artificial intelligence is the part of the ethics of technology specific to robots and other artificially intelligent beings. It is typically divided into roboethics, concerned with the moral behaviour of humans as they design, construct, use, and treat artificially intelligent beings, and machine ethics, concerned with the moral behaviour of artificial moral agents (AMAs).
The term "robot ethics" (roboethics) refers to the morality of how humans design, construct, use, and treat robots and other artificially intelligent beings. It considers both how artificially intelligent beings may be used to harm humans and how they may be used to benefit them.
- Track 12-1 Risk to human arrogance
- Track 12-2 Transparency, responsibility, and open source
- Track 12-3 Robot moralities
Bioinformatics is a multidisciplinary research field that combines computer science, biology, statistics, and mathematics into a broad-based field that will have profound impacts on all areas of biology. It is the application of computer technology to the management of biological information.
Biocomputing is the discipline of designing and constructing computers that contain biological components.
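As a small example of applying computation to biological information, the sketch below counts k-mers (overlapping length-k substrings) in a DNA sequence, a basic building block of sequence analysis; the sequence itself is made up for illustration.

```python
from collections import Counter

def kmer_counts(seq, k=3):
    # count every overlapping window of length k in the sequence
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

# A 7-base toy sequence yields 5 overlapping 3-mers.
counts = kmer_counts("ATGATGC", k=3)
```

Counts like these feed into genome assembly, sequence comparison, and error correction, which is where the AI-driven sequencing topics in the track list come in.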
- Track 13-1 Computational Evolutionary Biology
- Track 13-2 DNA Sequencing with Artificial Intelligence
- Track 13-3 Intelligent Systems in Bioinformatics
- Track 13-4 Genetics and Genomics
- Track 13-5 Prediction of Protein Analysis
Ubiquitous computing is a branch of computer science and software engineering in which computing is made to appear anytime and everywhere. It can occur using any device, in any location, and in any format.
Key features include:
- Use of inexpensive processors, which reduces storage and memory requirements.
- Totally connected, constantly available computing devices and the capture of real-time attributes.
- Focus on many-to-many relationships in the environment, instead of one-to-one, many-to-one, or one-to-many, along with the idea of technology that is constantly present.
- Reliance on wireless technology, converging the Internet and advanced electronics.
- Track 14-1 System support infrastructures and services
- Track 14-2 Middleware services and agent technologies
- Track 14-3 User interfaces and interaction models
- Track 14-4 Wireless/mobile service management and delivery
- Track 14-5 Interoperability and wide-scale deployment
- Track 14-6 Wearable computers and technologies