Open Access Review Article

The Future of Health and Healthcare in a World of Artificial Intelligence

Upendra K Kar1* and Rupa Dash2

1Division of Radiation Health, Department of Pharmaceutical Sciences, Winthrop P. Rockefeller Cancer Institute, University of Arkansas for Medical Sciences, Little Rock, Arkansas, USA

2Dash Global Media Corp, Los Angeles, CA, USA


Received Date: August 13, 2018;  Published Date: September 26, 2018

Abstract

Artificial Intelligence (AI) consists of a set of computational technologies designed to sense, learn, reason, and take action. AI has already been integrated into many applications, including automating business processes, gaining insight through data analysis, and engaging with customers and employees. AI-enabled machines using deep learning algorithms have been shown to exceed human performance, particularly in visual tasks such as playing Atari games, strategic board games like Go, and object recognition. Furthermore, with technological advancements in mobile computing, artificial neural networks, robotics, internet-scale data storage, cloud-based machine learning, and information processing algorithms, AI has been integrated into many sectors, including transportation, service robots, health care, education, low-resource communities, public safety and security, employment and the workplace, and entertainment. In this review, we highlight recent trends and applications of AI in healthcare: mining electronic health records (EHR), designing treatment plans, robot-assisted surgery, medical management, and supporting hospital operations. Many AI platforms have also proved instrumental in the development of innovative drugs. There is no doubt that AI will enhance the cooperation between humans and machines in the years to come and will play an integral role in improving the health index and quality of life.

Introduction

It is universally accepted that we have practiced preventive health care for many centuries. However, to cope with new adversities and challenges such as cancers and epidemics, we have to move towards predictive and personalized medicine. In recent years, the development of new technologies and opportunities, such as large-scale analysis of the genome, transcriptome, proteome, metabolome, and microbiome, has given us hope of progressing faster toward practicing personalized medicine (1). Using the big data sets generated from these omics analyses as the backbone, we can add the universe of clinical data accumulated over the past few centuries to design a better and more effective healthcare system.

So far, Moore's Law, the observation that the number of transistors in a dense integrated circuit doubles about every two years, has held its value (3). Complementary metal-oxide-semiconductor (CMOS) architectures have become lighter, faster, cheaper, and more powerful roughly every eighteen to twenty-four months for the past five decades. However, according to some experts, by 2021 "transistors could get to a point where they could shrink no further." Silicon chips have shrunk to fourteen nanometers, and Intel is already on its way to seven-nanometer chips. These technological advancements have transformed computing power, allowing us to build computers that are not only more powerful but also lighter, faster, cheaper, and consume 15-300 times less energy than the best CMOS chips used only a decade ago (4).

Artificial intelligence has been defined as the development of computers to engage in humanlike thought processes, such as learning, reasoning, and self-correction (5). In recent years, AI has been described as a "broad suite of technologies that can match or surpass human capabilities, particularly those involving cognition" (3). Though these technologies are not new, but rather a couple of decades old, in the last 15 to 20 years they have captured our imagination (6). AI services such as Siri and Alexa, self-driving cars, implantable medical devices, electronic trading, and automation using robots are a few examples. Algorithms such as machine learning and deep learning drive these computers' "thoughts" by providing a conceptual framework for processing input and making decisions based on that data (6). Modern machines built on AI platforms (for example, Nvidia's DGX-1) can accept information about a problem from their surroundings, generate a list of probable actions, and maximize the chance of achieving their goals by using logic and probability to choose the actions with the highest likelihood of success (3). These machines learn and act intelligently based on big data sets and can recognize objects or sounds with considerable precision (4). Using deep learning algorithms, powered by advances in computation, such machines have been shown to exceed human performance, particularly in visual tasks such as playing Atari games, strategic board games like Go, and object recognition (5-10).

Lightweight unmanned aerial vehicles enabled with machine-learning techniques have proven revolutionary in environmental monitoring: they can not only access hard-to-reach places but also map vegetation types more quickly and efficiently than traditional ground-based methods. Using similar technology, fire safety researchers can now obtain real-time images and monitoring data of fire scenes and plan their emergency rescue missions, which has been effective in saving the lives of both victims and firefighters. Similarly, in astronomy, machine learning has proven pivotal, particularly in time-domain surveys, where AI-enabled cameras communicate with each other every second to detect and analyze intriguing objects at a rate not feasible with human skills alone. Machine learning has also recently been applied in undergraduate science education to analyze student writing. Used in the classroom, it can allow instructors to understand how their students are thinking about scientific concepts based on their written thoughts and ideas, giving instructors real-time information about how students learn and acquire skills.

In healthcare, technologies such as Artificial Intelligence (AI), Machine Learning (ML), and Natural Language Processing (NLP) have further encouraged us to accomplish our goal of designing data-centric, data-driven, effective personalized medicine (7-10). They have shown tremendous potential in mining electronic health records (EHR), designing treatment plans, robot-assisted surgery, medical management, supporting hospital operations, interpreting clinical data, clinical trial recruitment, automated image diagnosis, preliminary diagnosis, virtual nursing, and connected healthcare devices. Many AI platforms have also been designed for the development of innovative drugs. There is no doubt that AI will enhance the cooperation between humans and machines and will play an integral role in improving the health index and quality of life.

Technical Jargon Used in AI

Natural language processing

Natural language processing (NLP) is a field of computer science, artificial intelligence, and computational linguistics concerned with the interactions between computers and human (natural) languages (11). It is a collection of techniques employed to aid human-human communication, as in machine translation (MT); to aid human-machine communication, as with conversational agents; or to benefit both humans and machines by analyzing and learning from the enormous quantity of human language content now available online (11). It is being increasingly used in real-world applications for increased interactivity and productivity, such as creating spoken dialogue systems and speech-to-speech translation engines, mining social media for information about health or finance, and identifying sentiment and emotion toward products and services (11).
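
To make this concrete, here is a minimal sketch of one of the simplest NLP pipelines mentioned above, sentiment identification, using a bag-of-words model in scikit-learn; the tiny corpus and labels are invented for illustration.

```python
# Minimal bag-of-words sentiment classifier (illustrative sketch).
# The tiny training corpus below is hypothetical, for demonstration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

texts = [
    "the new clinic service was excellent",
    "I loved how quickly my questions were answered",
    "the wait time was terrible and staff were rude",
    "billing errors made the experience awful",
]
labels = [1, 1, 0, 0]  # 1 = positive sentiment, 0 = negative

vectorizer = CountVectorizer()          # tokenize and count words
X = vectorizer.fit_transform(texts)     # documents -> term-count matrix
clf = LogisticRegression().fit(X, labels)

# Classify a new, unseen sentence
new = vectorizer.transform(["the staff were wonderful"])
print(clf.predict(new))  # expected: [1] (positive)
```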

Machine learning

Machine learning gives "computers the ability to learn without being explicitly programmed." It explores the study and construction of algorithms that can learn from experience: a program is said to be a machine learning program if it learns from experience (E) with respect to some task (T) and some performance measure (P), such that its performance on T, as measured by P, improves with experience E (12-14). A variety of ML algorithms have been developed, including Support Vector Machines (SVM), maximum likelihood (ML), Neural Networks (NN), k-Nearest Neighbors (kNN), decision trees, and Bayesian networks. These techniques help in making predictions or decisions by building a model from sample inputs (12-14).
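
The (T, E, P) definition above can be illustrated directly: in the sketch below, the task T is classifying iris flowers, the experience E is a growing set of labeled examples, and the performance measure P is held-out accuracy, which typically improves as E grows.

```python
# Illustrative sketch of the task/experience/performance (T, E, P) framing:
# T = classifying iris species, E = labeled training examples,
# P = accuracy on held-out data. P on T typically improves with more E.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

for n in (10, 30, len(X_train)):          # growing amounts of experience E
    clf = SVC().fit(X_train[:n], y_train[:n])
    print(n, "examples -> accuracy:", clf.score(X_test, y_test))
```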

Deep learning

Deep learning is part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms (15). These algorithms try to learn multiple levels of abstraction, representation, and information automatically from large data sets. Deep learning (DL) techniques include autoencoder neural networks, Deep Stacking Networks (DSN), Deep Belief Networks (DBN), Long Short-Term Memory (LSTM), Extreme Learning Machines (ELM), Generative Adversarial Networks (GANs), etc. These techniques have made significant inroads into many areas, including speech recognition, text recognition, lip reading, computer-aided diagnosis, face recognition, and drug discovery (15).
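
As a minimal illustration of the first technique listed above, the sketch below trains a tiny autoencoder in PyTorch on random stand-in data; real applications would use images, signals, or records in place of the random tensor.

```python
# Minimal autoencoder sketch in PyTorch: the network learns a compressed
# representation of its input (here, random stand-in data) by reconstructing it.
import torch
import torch.nn as nn

model = nn.Sequential(           # encoder: 64 -> 8 dims, decoder: 8 -> 64 dims
    nn.Linear(64, 8), nn.ReLU(),
    nn.Linear(8, 64),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.randn(256, 64)         # hypothetical unlabeled data
for epoch in range(100):
    recon = model(x)             # reconstruct the input from the bottleneck
    loss = loss_fn(recon, x)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
print("final reconstruction error:", loss.item())
```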

Reinforcement learning

Reinforcement learning is an area of machine learning inspired by behaviorist psychology, concerned with how software agents ought to take actions in an environment so as to maximize some notion of cumulative reward (9,16). It shifts the focus of machine learning from pattern recognition to experience-driven sequential decision-making. This holds great promise given the vast amounts of data already available on the internet (16).
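
A minimal tabular Q-learning sketch makes the reward-maximization idea concrete: an agent on a five-cell corridor learns, purely from experience, that walking right leads to the reward. The environment and parameters are invented for illustration.

```python
# Tabular Q-learning sketch: an agent on a 1-D corridor of 5 cells learns to
# walk right to reach a reward at the end. Purely illustrative.
import random

n_states, actions = 5, [-1, +1]          # move left or right
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.1    # learning rate, discount, exploration

for episode in range(200):
    s = 0
    while s != n_states - 1:
        if random.random() < epsilon:
            a = random.choice(actions)                  # explore
        else:
            a = max(actions, key=lambda x: Q[(s, x)])   # exploit best known
        s2 = min(max(s + a, 0), n_states - 1)
        r = 1.0 if s2 == n_states - 1 else 0.0
        # Q-update: move Q toward reward + discounted best future value
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in actions)
                              - Q[(s, a)])
        s = s2

print({s: max(actions, key=lambda x: Q[(s, x)]) for s in range(n_states - 1)})
# expected learned policy: move +1 (right) in every non-terminal state
```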

Crowdsourcing and human computation

"Crowdsourcing" and "human computation" refer to various ways that people and computing have been brought together to achieve outcomes that were previously beyond our individual capabilities or expectations. They solve problems that are beyond the capabilities of artificial intelligence algorithms alone. Google's search algorithms, Wikipedia's millions of articles, Amazon's recommendations, and the success of Linux and other open-source software projects are examples of ways in which technology and people together have exceeded the capabilities of people or machines in isolation (17). Another example of such a human computation application is reCAPTCHA, which leverages the millions of users who solve CAPTCHAs every day to correct words in books that optical character recognition (OCR) programs fail to recognize with certainty (17).

Computer vision

Computer vision (CV) is an interdisciplinary field that deals with how computers can gain high-level understanding from digital images or videos. Using CV, a machine processes raw visual input, such as a JPEG file or a camera feed, and understands what it is seeing. Deep learning methods now enable computers to perform some vision tasks better than people. Familiar examples include the bar code scanner, which "sees" the stripes in a UPC, and face-scanning devices such as those in Apple phones.
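
As a concrete example in the spirit of the face-scanning devices mentioned above, the sketch below uses OpenCV's bundled Haar-cascade face detector; the input path is a placeholder.

```python
# Classical face-detection sketch with OpenCV's bundled Haar cascade.
# "photo.jpg" is a placeholder path you would replace with a real image.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("photo.jpg")                    # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)     # detector works on grayscale
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:                       # draw a box around each face
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("faces.jpg", img)
print(f"detected {len(faces)} face(s)")
```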

Algorithmic game theory and computational social choice

Game theory is the mathematical study of strategic behavior in interactive decision-making environments, in which the utility of each agent depends not only on its own decisions but also on those of other agents. A central concern of game theory is the development and analysis of solution concepts, such as the Nash equilibrium or the core, which provide answers to diverse questions: which actions rational agents and groups of agents can be expected to choose, which coalitions are likely to form, and how the earnings of a collaboration are to be divided. Social choice theory concerns the formal analysis and design of methods for aggregating the preferences of multiple agents and has many theoretical and real-life applications, such as voting, resource allocation, coalition formation, and ranking objects (6,19).
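
A worked example helps here: the sketch below enumerates the pure-strategy Nash equilibria of a 2x2 game (the classic Prisoner's Dilemma payoffs, chosen only for illustration) by checking that neither player can gain by deviating unilaterally.

```python
# Sketch: enumerate pure-strategy Nash equilibria of a 2x2 game.
# A[i][j], B[i][j] are row and column players' payoffs for actions (i, j),
# here the classic Prisoner's Dilemma (0 = cooperate, 1 = defect).
A = [[-1, -3],   # row player's payoffs
     [ 0, -2]]
B = [[-1,  0],   # column player's payoffs
     [-3, -2]]

def pure_nash(A, B):
    eq = []
    for i in range(2):
        for j in range(2):
            # (i, j) is an equilibrium if neither player gains by deviating
            row_best = all(A[i][j] >= A[k][j] for k in range(2))
            col_best = all(B[i][j] >= B[i][k] for k in range(2))
            if row_best and col_best:
                eq.append((i, j))
    return eq

print(pure_nash(A, B))  # expected: [(1, 1)] -> both defect
```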

Neuromorphic computing

The field is highly interdisciplinary, combining ideas and methods from psychology, computer science, linguistics, philosophy, and neuroscience. Its aim is to dissect the nature of human knowledge: its forms, its content, and how knowledge is acquired, processed, and utilized. Since its inception by Carver Mead in the late 1980s, when very-large-scale integration (VLSI) systems containing electronic analog circuits were first used to mimic the neuro-biological architectures present in the nervous system (6,18), the research has extended to language, memory, visual perception and cognition, thinking and reasoning, social cognition, decision making, and cognitive development. Technology giants including Intel, IBM, and Qualcomm are now engaged in a race to develop neuromorphic computers.

Robotics

Robotics is the interdisciplinary branch of engineering and science that includes mechanical engineering, electrical engineering, computer science, and others. It deals with the design, construction, operation, and use of robots, as well as the computer systems for their control, sensory feedback, and information processing (6).

Internet of things (IoT)

The Internet of Things (IoT) is a metaphor that encapsulates almost anything and everything in the communications space. It covers a wide array of features, including the extension of the internet and the web into the physical realm, the deployment of extensively embedded distributed devices, and sensing and actuation abilities. The term IoT is also called the future internet. According to the IoT European Research Cluster (IERC), IoT is a multidimensional and multi-faceted paradigm: it can connect "fixed" and mobile "things" with various levels of "intelligence", sensing/actuating, interpreting, communicating, and processing information and exchanging knowledge over the hyper-connected IoT space using different types of platforms (6,19-22). According to the 2020 conceptual framework, IoT can be expressed through a simple formula: IoT = Services + Data + Networks + Sensors (6,19-22). Radio Frequency Identification (RFID) and related technologies play a vital role in IoT (6,19-23). Using IoT, a wide array of devices, including appliances, vehicles, buildings, and cameras, become interconnected to collect and share their abundant sensory information for healthcare purposes.
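
The formula above can be read as a pipeline: a sensor produces data, a network carries it, and a service consumes it. The hedged sketch below illustrates that pipeline for a hypothetical home health-monitoring device; the endpoint URL, device id, and payload schema are all invented.

```python
# Hedged sketch of an IoT-style health sensor: a wearable posts a heart-rate
# reading to a (hypothetical) monitoring endpoint. The URL, device id, and
# payload schema are invented for illustration.
import json, time, urllib.request

reading = {
    "device_id": "wearable-042",            # hypothetical device identifier
    "timestamp": time.time(),
    "heart_rate_bpm": 72,                   # the sensed value (Sensors)
}
req = urllib.request.Request(
    "https://example.com/api/v1/readings",  # placeholder Service endpoint
    data=json.dumps(reading).encode(),      # the Data payload
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:   # send over the Network
    print(resp.status)                      # 200/201 expected on success
```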

History of AI

The field of Artificial Intelligence (AI) came into existence in 1956, at a workshop organized by John McCarthy (24). The original goal was to find out how machines could be designed to simulate aspects of intelligence. The pioneering work of McCarthy, Marvin Minsky, Nathaniel Rochester, Claude Shannon, Arthur Samuel, Oliver Selfridge, Ray Solomonoff, Allen Newell, and Herbert Simon then galvanized the field of "artificial intelligence" (6,25-26). Alan Turing, in his article "Computing Machinery and Intelligence", proposed the possibility of computers created to simulate intelligence, considered how intelligence might be tested, and asked how machines might automatically learn (25). Newell and Simon, pioneers of heuristic research, developed the Logic Theorist program, which proved mathematical theorems (26). Selfridge and colleagues developed complex applications such as face recognition (27). "Shakey", a wheeled robot built at SRI International that used natural language processing, gained much attention in the field of mobile robotics (28). Later, a self-play checkers-playing program based on machine learning was developed. In the late 1990s, technological progress allowed data-driven systems to be built. With the availability of computing power to store and process large data sets, an internet with the capacity to gather large amounts of data, and statistical techniques that by design can derive solutions from these data sets, AI emerged as one of the powerful technologies of the century (6,29). AI is now being used in many sectors, including entertainment, transportation, service, healthcare, education, outreach to low-resource communities, public safety and security, and employment and the workplace.

Application of AI in Healthcare

Theoretically, the application of AI can be extended to almost every aspect of healthcare. According to Rock Health, one of the leading venture capital firms, 121 health AI and machine learning companies raised $2.7 billion between 2011 and 2017 (30). Technological advancement has allowed us to build modern medical equipment with integrated AI technology, which helps in clinical decision support, patient monitoring and coaching, automated devices to assist in surgery or patient care, and the management of healthcare systems (3). In personalized medicine in particular, i.e., diagnosis and treatment, AI has proved to be a great support to medical professionals in the clinic. For example, machine learning can analyze the scientific literature and find patterns to predict a diagnosis; similarly, it can analyze social media data to predict patient health risks.

Over the years, robotics has demonstrated the capability to assist surgeons in complicated surgeries (30-33). When Robodoc developed and demonstrated robotic systems for orthopedic surgeries, i.e., hip and knee replacements, it was a celebrated moment, and since then the use of robotics in various surgeries has exploded. The da Vinci robotic system provides a three-dimensional view, hand-tremor filtering, fine dexterity and motion scaling, and is suitable for narrow, inaccessible operative areas. Developed by Intuitive Surgical for minimally invasive heart bypass surgery, the da Vinci system became a success story that later extended to other procedures, including prostate cancer surgery (33). It is now in its fourth generation and is considered the standard of care in multiple laparoscopic procedures. It is the most common surgical robot, and various modifications have been made for different types of surgeries, including prostate, gynecologic, cardiothoracic, orthopedic, gall bladder, hernia, colon, and cancer cases. This platform is considered revolutionary not only because it is widely used across procedures, but because it serves as a data platform for studying the process of surgery itself (30-33). Robots have also shown great potential to facilitate other therapies, including autism therapy: Rudovic et al. recently showed the efficacy of a personalized machine learning (ML) robot for autism therapy (34).

Automated image interpretation has been proposed for many decades. Most modern medical imaging modalities, i.e., CT, MR, ultrasound, etc., generate digital images, so large-scale machine learning techniques can be applied to medical image data (2). Interestingly, a few tertiary care health centers have millions of patient scans, with the associated radiological reports, in their health records; AI is the definitive next step to analyze these data and predict disease outcomes. In recent years, papers have appeared in the literature showing that AI radiological findings are highly reliable. Many small and large corporations, including Google, Dell, Hewlett-Packard, Apple, Hitachi, Microsoft, Amazon, and IBM, have been actively pursuing ways to transform healthcare using AI. Google's DeepMind Health project is being used to mine medical records in order to provide better and faster health services. Using the GoogleNet Inception v3 CNN architecture with a dataset of 129,450 clinical images, Esteva et al. demonstrated that artificial intelligence is capable of classifying skin cancer with a level of competence comparable to dermatologists (8). Building on this success, Alphabet is working on a genetic data-collecting initiative that could be transformative in healthcare. In a recent report, Vandenberghe et al. demonstrated how AI can supplement pathologist expertise to ensure accuracy in the diagnosis of breast cancer (35). They used two machine learning models, Support Vector Machines (SVM) and Random Forests (RF), and their results showed a staggering 83% concordance with the pathologist. This study validates that deep learning-aided diagnosis can facilitate clinical decision making in breast cancer by identifying cases at high risk of misdiagnosis (35).
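
The approach of Esteva et al. rests on fine-tuning a CNN pretrained on general images. The sketch below shows this transfer-learning pattern with a ResNet-18 backbone standing in for their Inception v3; the two-class head (e.g., benign vs. malignant) and the random batch are placeholders.

```python
# Transfer-learning sketch: reuse a CNN pretrained on ImageNet and retrain
# only a new 2-class head (e.g., benign vs. malignant). ResNet-18 stands in
# for the Inception v3 used by Esteva et al.; the data below is random.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False                        # freeze pretrained backbone
model.fc = nn.Linear(model.fc.in_features, 2)      # new trainable 2-class head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 224, 224)   # stand-in for a batch of skin images
labels = torch.randint(0, 2, (8,))     # stand-in labels
logits = model(images)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
print("one fine-tuning step done, loss =", loss.item())
```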

Enlitic has developed an AI platform that uses deep learning to analyze radiology images, compare them with other data, and interpret the information. Another start-up, Arterys, partnered with GE Healthcare to develop ViosWorks to reform cardiac MRI. Arterys's platform is designed to acquire seven dimensions of data, including 3D heart anatomy, blood flow rate, and blood flow direction. More importantly, the scanning process takes 6 to 10 minutes instead of an hour, and patients do not need to hold their breath during the examination. Butterfly Network has used AI to create a handheld medical imaging device that can make MRI and ultrasound significantly cheaper and more efficient. Bay Labs and collaborators have taken the technology to Africa to help identify symptoms of Rheumatic Heart Disease (RHD) in Kenyan school children (36): their platform can analyze data derived from an ultrasound and predict whether it is seeing something consistent with RHD. 3Scan has developed a platform that can perform in one day a tissue sample analysis that would take a pathologist a year using traditional methods (36). This is an immense help to pathology laboratories and researchers, as it gives them a better view of tissues (30).

Similarly, AI-driven tools like IBM Watson Health (IWH) point to the future. A few years back, IBM Watson launched a project called WatsonPaths in collaboration with the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University; WatsonPaths consists of two cognitive computing technologies run by AI algorithms (36). Watson holds millions of pages of academic literature and other healthcare data, which make the system capable of offering a series of suggestions along with confidence levels that show how applicable each course of action may be; the higher the number, the more certain Watson is that a particular drug, therapy, or diagnosis is the way to go (2,3). IWH can read structured and unstructured information in a cardiologist's medical report and combine it with established data from previous datasets to determine whether a diagnosis is right or not (1,30). Panahiazar et al. compared the performance of the Seattle Heart Failure Model (SHFM) with their own machine learning-based risk prediction model using routine clinical care (EHR) data from the Mayo Clinic. Their model was more accurate (an 11% improvement in AUC) than the conventional model, and their study demonstrated that adding new predictive markers (such as co-morbidities) to the model can significantly improve prognostic performance (a further 8% improvement in AUC) (37). In a recent study, Weng et al. compared the American College of Cardiology/American Heart Association (ACC/AHA) guidelines with four machine-learning algorithms: random forest, logistic regression, gradient boosting, and neural networks. Using the four techniques, they analyzed the electronic medical records of 378,256 patients available in 2005 and predicted their cardiovascular events over the next 10 years (38). Most importantly, because the machine-learning methods can take account of 31 more data points, such as ethnicity, arthritis, and kidney disease, these AI methods performed significantly better than the conventional ACC/AHA guidelines. The AUC (where a score of 1.0 signifies 100% accuracy) of the ACC/AHA guidelines was 0.728, while the four AI methods ranged from 0.745 to 0.764. Their study further revealed that neural networks correctly predicted 7.6% more events than the ACC/AHA method while raising 1.6% fewer false alarms; in terms of clinical correlation, these techniques could have saved 355 additional patients. Weng et al. suggest that including additional parameters, such as lifestyle and genetic factors, in the algorithms can further improve accuracy (38). AI-driven tools like IWH can be applied to many other conditions of the eye, lung, brain, breast, etc.
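
The comparison Weng et al. report can be reproduced in miniature. The sketch below scores the same four model families by AUC on a synthetic dataset that stands in for the (non-public) EHR records.

```python
# Miniature version of the Weng et al. comparison: four model families scored
# by AUC on synthetic data (a stand-in for the real, non-public EHR records).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(random_state=0),
    "gradient boosting": GradientBoostingClassifier(random_state=0),
    "neural network": MLPClassifier(max_iter=1000, random_state=0),
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, m.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")   # 1.0 would mean perfect discrimination
```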

AI has also been well integrated into other areas of clinical practice. For example, using "TeleLanguage", clinicians can now conduct multiple therapy sessions in multiple locations simultaneously. The mobile computing revolution has galvanized the application of AI in many other areas of health care, including geriatrics, psychiatry, and neurology. For example, psychiatrists now use an AI program, Lifegraph, to detect early signs of distressful behavior in patients: the application collects data from a patient's smartphone, analyzes behavioral patterns, and alerts clinicians with predictions about their patients (36). There are now at least a couple of thousand mobile applications used in different areas of healthcare, and collecting biometrics from motion-tracking wearable devices is common across the globe. Furthermore, with advances in the "internet of things", the entire home environment can now be connected with health monitoring devices (3). This has transformed the field of geriatrics: with various mobile applications, both the elderly and their doctors can monitor movement and activities, allowing patients to stay at home rather than visit the hospital frequently. Many start-ups are now actively designing and developing devices to mitigate the effects of hearing and vision loss, and there is active interest in smart assistive devices (i.e., walkers, wheelchairs, etc.) that will help extend users' range of activities (3).

A small start-up from Chicago, CareSkore, has smartly applied cloud computing to healthcare analytics. Through its Zeus algorithm, it predicts in real time how likely a patient is to be readmitted to a hospital, based on a combination of clinical, laboratory, demographic, and behavioral data. Another company, CloudMedX, develops algorithms, machine learning, and natural language processing to generate real-time clinical insights at all points of care, which can be used to improve patient outcomes (36). This is valuable for hospital administration, as such data help improve the quality of care, while patients also get a clearer picture of their health. Sentrian has developed an algorithm that predicts whether patients will become sick even before they experience symptoms, which is particularly helpful for tackling chronic diseases and for remote patient monitoring. Although radiation therapy has proved ideal for treating various cancers, many underlying issues are linked with it, i.e., radiation-associated complexity (36). Oncora Medical has developed an AI platform that helps doctors design sound radiation treatment plans for patients (36).
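
As an illustration of the kind of readmission scoring described above (the Zeus algorithm itself is proprietary), the sketch below fits a logistic model that turns a few invented clinical and behavioral features into a readmission probability.

```python
# Hedged sketch of readmission-risk scoring in the spirit of systems like
# Zeus (whose internals are not public): a logistic model maps
# clinical/demographic features to a readmission probability.
import numpy as np
from sklearn.linear_model import LogisticRegression

# hypothetical features: [age, prior_admissions, lab_score, missed_appts]
X = np.array([[80, 3, 1.2, 2], [45, 0, 0.3, 0], [67, 2, 0.9, 1],
              [52, 1, 0.4, 0], [74, 4, 1.5, 3], [39, 0, 0.2, 0]])
y = np.array([1, 0, 1, 0, 1, 0])   # 1 = readmitted within 30 days (invented)

model = LogisticRegression().fit(X, y)
new_patient = np.array([[70, 2, 1.0, 1]])
risk = model.predict_proba(new_patient)[0, 1]   # probability, not a label
print(f"estimated readmission risk: {risk:.0%}")
```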

AI has also proved to play a major role in the analysis of big data sets in genomics, which is important in drug discovery (39). Moreover, advanced medical care for diseases like cancer is heavily dependent on genotype-phenotype connections. Since the completion of the Human Genome Project, genome sequencing is becoming routine for patients, and it is practically impossible for humans to analyze these data quickly and comprehensively without the help of computers. Integration of AI into these computers will enable them to analyze such data sets quickly and comprehensively and predict clinical conditions (39-40). More importantly, because computers do not forget what they have learned and do not carry inherent biases, they are more likely to produce objective diagnoses, which is key in advanced medical care (2).

Recent AI platforms have also taken big strides in healthcare research. For example, the platform developed by Atomwise can predict whether a particular drug will work or not (36). Similarly, Recursion Pharmaceuticals has built a proprietary drug discovery platform that complements high-throughput biology well; using their platform, novel drugs can be identified for rare genetic diseases (36). Deep Genomics has developed technology to predict the effects of a particular mutation based on its analyses of hundreds of thousands of other mutations. Turbine has used AI to model cell biology and design personalized treatments for any cancer type or patient faster than any traditional healthcare service; the technology is already used in collaborations with Bayer, the University of Cambridge, and top Hungarian research groups to find new cancer cures, speed up time to market, and save the lives of patients suffering from currently incurable forms of the disease (30). Another small company, Zephyr Health, uses machine learning to help life science companies improve research and reduce the time it takes to bring therapies to market (30). In recent years, Microsoft has also been actively engaged in applying advanced machine learning algorithms to the mysteries of human biology (34).

We are now seeing the benefits of neuromorphic computing, which offers a practical approach to solving problems not addressable with classical algorithmic programming (17). The Neurogrid, BrainScaleS, TrueNorth, and SpiNNaker systems are designed to mimic biological neural networks to improve the hardware efficiency and robustness of computing systems. As digital equivalents of neurons, neuromorphic circuits can communicate in parallel (and without the rigidity of clocked time) using "spikes": bursts of electric current sent as needed. Like our brains, these chips process incoming flows of electricity, with each neuron communicating with the others. Using these technologies, robots will be able to function more like humans, i.e., respond to spoken commands, recognize their surroundings, and navigate independently. Brainstorm, from Stanford's Brains in Silicon lab, could enhance brain-machine interfaces so that an algorithm identifies the neural spikes in a paralyzed person's brain and directs a prosthetic arm to execute the intended task. IBM's neuromorphic chips are already in a testing phase with sensors that identify objects and then use spoken words to help the blind live better lives.
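
The "spikes" described above are the output of model neurons such as the leaky integrate-and-fire unit sketched below: membrane voltage integrates input current, leaks over time, and emits a spike whenever it crosses a threshold. The parameters are illustrative, not those of any particular chip.

```python
# Minimal leaky integrate-and-fire neuron, the basic unit mimicked by the
# spiking chips above. Parameters are illustrative only.
import numpy as np

dt, tau, v_thresh, v_reset = 1.0, 20.0, 1.0, 0.0   # ms, ms, arbitrary units
steps = 200
current = np.where(np.arange(steps) > 50, 0.08, 0.0)  # input switches on at 50 ms

v, spikes = 0.0, []
for t in range(steps):
    v += dt * (-v / tau + current[t])   # leak toward 0, integrate input
    if v >= v_thresh:                   # threshold crossed: emit a spike
        spikes.append(t)
        v = v_reset                     # reset membrane after spiking
print(f"{len(spikes)} spikes at times (ms): {spikes}")
```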

Future Scope and Challenges

As we have highlighted, in the last few years the application of artificial intelligence has shown remarkable promise and has the potential to radically transform healthcare. Technologies like deep learning have proven to work efficiently in many areas, including audio and video sensing, object and activity recognition, and natural perception. However, many hurdles must be crossed before we realize the full potential of AI. The first and foremost is to gain the confidence of clinicians, nurses, patients, and those involved in regulatory and policy-making boards. As discussed earlier, robots are gradually being adopted to perform complicated surgery, including minimally invasive and surgeon-less guided surgeries. However, we still do not know how to address technical difficulties during a surgical procedure, and complicated surgical interventions need cognitive as well as decision-making skills. A small change, i.e., in lifestyle or diet, can play an important role in overall health, yet many programs have not been developed considering all aspects of life. Moreover, who will take responsibility if a surgery fails due to poor judgment? In a medical crisis, i.e., a death or wrong diagnosis, who will be responsible: the doctor or the hospital with the AI-enabled medical device?

We explained earlier how well-designed machine-learning algorithms have proven useful for accurately predicting outcomes and treating cancer patients. However, clinical prediction models are far from perfect, as there are many sources of variation, including food habits, behavior, socio-economic conditions, and lifestyle. We are all aware of how a machine-learning judge in a beauty contest discriminated against people with dark skin because the developers did not include enough minorities when building the system. We therefore need to be careful, as a small variation can change the outcome.

We have no clear picture of how privacy will be protected and how the benefits can be fairly shared across society (as these machines will be expensive); the benefits should not be restricted to the privileged class. Regulatory frameworks such as the FDA's and HIPAA (the Health Insurance Portability and Accountability Act) do not have enough technical experts to match the pace of AI research and innovation, so the approval of AI-based healthcare technology is very slow. To be fair to the FDA, there is a genuine lack of widely accepted methods and standards for evaluating the regulatory aspects of AI-based healthcare software or devices. But considering the immense cost/benefit tradeoffs and overall impact on society, regulators need to find a way to evaluate and move toward faster approval of new treatments and interventions (3). A report from the Executive Office of the President's National Science and Technology Council Committee on Technology notes that AI has the potential to contribute significantly to the public good, but that "an AI-enabled world demands a data-literate citizenry that is able to read, use, interpret, and communicate about data, and participate in policy debates about matters affected by AI" (2). The report further suggests: "As the technology of AI continues to develop, practitioners must ensure that AI-enabled systems are governable; that they are open, transparent, and understandable; that they can work effectively with people; and that their operation will remain consistent with human values and aspirations. Researchers and practitioners have increased their attention to these challenges and should continue to focus on them." (2).

Conclusion

The induction of AI into healthcare has already taken center stage, and in the future it has the potential to radically transform healthcare. Technological advancements in mobile computing, artificial neural networks, robotics, internet-scale data storage, cloud-based machine learning, and information processing algorithms have propelled the use of AI in modern healthcare. In the future, we will encounter medical equipment that is human-aware and able to interact naturally with the people it is meant to serve. Because of AI, modern healthcare can now become data-driven, data-oriented, and personalized.

Acknowledgment

This publication was supported by an Institutional Development Award (IDeA) from the National Institute of General Medical Sciences of the National Institutes of Health under grant number P20 GM109005.

References

  1. Chen R, Snyder M (2013) Promise of personalized omics to precision medicine. Wiley Interdiscip Rev Syst Biol Med 5(1): 73-82.
  2. Dilsizian SE, Siegel EL (2014) Artificial intelligence in medicine and cardiac imaging: harnessing big data and advanced computing to provide personalized medical diagnosis and treatment. Curr Cardiol Rep 16(1): 441.
  3. Thompson SE, Parthasarathy S (2006) Moore’s law: the future of Si microelectronics. Materials Today 9(6): 20-25.
  4. The Future of AI Is Neuromorphic.
  5. DeCanio SJ (2016) Robots and humans – complements or substitutes? Journal of Macroeconomics 49: 280–291.
  6. (2016) One Hundred Year Study on Artificial Intelligence. Stanford University.
  7. Dickson B (2017) How artificial intelligence is revolutionizing healthcare.
  8. Esteva A, Kuprel B, Novoa RA, Ko J, Swetter SM, et al. (2017) Dermatologist-level classification of skin cancer with deep neural networks. Nature 542: 115-118.
  9. Mnih V, Kavukcuoglu K, Silver D, Rusu AA, Veness J, et al. (2015) Human-level control through deep reinforcement learning. Nature 518: 529-533.
  10. Silver D, Huang A, Maddison CJ, Guez A, Sifre L, et al. (2016) Mastering the game of Go with deep neural networks and tree search. Nature 529: 484-489.
  11. Hirschberg J, Manning CD (2015) Advances in natural language processing. Science 349(6245): 261-266.
  12. Das S, Dey A, Pal A, Roy N (2015) Applications of Artificial Intelligence in Machine Learning: Review and Prospect. International Journal of Computer Applications 115(9): 31-41.
  13. Horvitz E (2006) Machine learning, reasoning, and intelligence in daily life: Directions and challenges. USA.
  14. LeCun Y, Bengio Y, Hinton G (2015) Deep learning. Nature 521(7553): 436-444.
  15. Arora M, Dhawan S, Singh K. Deep Learning: Overview, Architecture, Framework and Applications. International Journal of Latest Trends in Engineering and Technology 10(1): 379-384.
  16. Li Y (2017) Deep Reinforcement Learning: An Overview. arXiv preprint.
  17. Ipeirotis PG, Paritosh PK (2011) Managing Crowdsourced Human Computation.
  18. Calimera A, Macii E, Poncino M (2013) The Human Brain Project and neuromorphic computing. Funct Neurol 28(3):191-196.
  19. Torresen J (2018) A Review of Future and Ethical Perspectives of Robotics and AI. Frontiers in Robotics and AI.
  20. Vermesan O, Bacquet J (2017) Cognitive Hyperconnected Digital Transformation - Internet of Things Intelligence Evolution. River Publishers.
  21. Atzori L, Iera A, Morabito G (2017) Understanding the Internet of Things: definition, potentials, and societal role of a fast evolving paradigm. Ad Hoc Networks 56: 122-140.
  22. Ziegeldorf JH, Garcia Morchon O, Wehrle K (2014) Privacy in the Internet of Things: threats and challenges. Communication and Distributed Systems 7(12): 2728-2742.
  23. Zhang J, Tian GY, Marindra AMJ, Sunny AI, Zhao AB (2017) A Review of Passive RFID Tag Antenna-Based Sensors and Systems for Structural Health Monitoring Applications. Sensors (Basel) 17(2): 265.
  24. McCarthy J, Minsky ML, Rochester N, Shannon CE (1955) A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence. AI Magazine 27(4): 12-14.
  25. Turing AM (1950) Computing Machinery and Intelligence. Mind 59(236): 433-460.
  26. Newell A, Shaw JC, Simon HA (1959) Report on a general problem-solving program. Proceedings of the International Conference on Information Processing, UNESCO, Paris: 256-264.
  27. Selfridge OG (1959) Pandemonium: A paradigm for learning. Proceedings of the Symposium on Mechanization of Thought Processes: 511–529.
  28. Nilsson NJ (1984) Shakey the robot. SRI report.
  29. Ferrucci DA (2012) Introduction to ‘This is Watson’. IBM Journal of Research and Development 56(3.4).
  30. Zweig M, Tran D. The AI/ML use cases investors are betting on in healthcare. Rock Health.
  31. Kanevsky J, Corban J, Gaster R, Kanevsky A, Lin S, et al. (2016) Big Data and Machine Learning in Plastic Surgery: A New Frontier in Surgical Innovation. Plast Reconstr Surg 137(5): 890-897.
  32. Maeso S, Reza M, Mayol JA, Blasco JA, Guerra M, et al. (2010) Efficacy of the Da Vinci surgical system in abdominal surgery compared with that of laparoscopy: a systematic review and meta-analysis. Ann Surg 252(2): 254-262.
  33. Ishikawa N, Watanabe G, Hirano Y, Inaki N, Kawachi K, et al. (2007) Robotic dexterity: evaluation of three-dimensional monitoring system and non-dominant hand maneuverability in robotic surgery. J Robot Surg 1(3): 231-233.
  34. Rudovic O, Lee J, Dai M, Schuller B, et al. (2018) Personalized machine learning for robot perception of affect and engagement in autism therapy. Science Robotics 3(19): eaao6760.
  35. Vandenberghe ME, Scott ML, Scorer PW, Balcerzak D, et al. (2017) Relevance of deep learning to facilitate the diagnosis of HER2 status in breast cancer. Scientific Reports 7: 45938.
  36. Artificial Intelligence has to and will redesign healthcare.
  37. Panahiazar M, Taslimitehrani V, Pereira N, Pathak J (2015) Using EHRs and Machine Learning for Heart Failure Survival Analysis. Stud Health Technol Inform 216: 40-44.
  38. Weng SF, Reps J, Kai J, Garibaldi JM, et al. (2017) Can machine-learning improve cardiovascular risk prediction using routine clinical data? PLoS One 12(4): e0174944.
  39. Pastur Romay LA, Cedrón F, Pazos A, Porto Pazos AB (2016) Deep Artificial Neural Networks and Neuromorphic Chips for Big Data Analysis: Pharmaceutical and Bioinformatics Applications. Int J Mol Sci 17(8): E1313.
  40. Peng H, Zhou J, Zhou Z, Bria A, et al. (2016) Bioimage Informatics for Big Data. Adv Anat Embryol Cell Biol 219: 263-272.