How NASA is Introducing AI Technologies on Earth and in Space Exploration
April 17, 2025

[Image: scientists working with satellites and AI technology]

Artificial intelligence (AI) isn't just changing the way we do things on Earth; it's also transforming how we approach the universe. NASA has been exploring the power of AI for years, and recent developments are pushing the boundaries of what's possible in space exploration and scientific discovery even further. From autonomous rovers on Mars to AI-enhanced efforts to find new exoplanets, our understanding of space is increasingly powered by machine learning and automation.

AI in Space Exploration: Mars and Beyond

One of the most notable examples of NASA's AI capabilities is the Perseverance rover on Mars. Unlike earlier rovers, which depended more on manual human input, Perseverance relies heavily on AI to navigate the Martian surface independently and in real time. It's equipped with an instrument called PIXL (Planetary Instrument for X-ray Lithochemistry) that uses AI (https://www.nasa.gov/missions/mars-2020-perseverance/perseverance-rover/heres-how-ai-is-changing-nasas-mars-rover-science/) to search for signs of ancient life by targeting and analyzing rock samples based on curated data from previous missions.

NASA's Curiosity rover also uses AI to operate its laser (https://www.slashgear.com/1768440/nasa-space-exploration-how-ai-changing-methods-tools-used/), selecting targets for chemical analysis autonomously. This is crucial for missions where human intervention is less feasible or more time-consuming.

Beyond exploring our closest planetary neighbor, AI is also helping scientists find planets light-years away. NASA's ExoMiner deep learning system recently identified 301 new exoplanets (https://www.jpl.nasa.gov/news/new-deep-learning-method-adds-301-planets-to-keplers-total-count/) by analyzing data from the Kepler Space Telescope. ExoMiner works by recognizing patterns in vast amounts of data, sifting through noise to spot planets that would otherwise be overlooked, and classifying candidates more consistently than human experts or earlier machine classifiers alone.

NASA's telescopes collect massive amounts of data about deep space every day, and AI can help process this data faster while allowing for more accurate analysis.
For example, machine learning models can analyze light curves (https://www.nccs.nasa.gov/about-us/internships/intern-bios/adam-2020) to predict cosmic events like supernovae and gamma-ray bursts. AI has also been used to identify gravitational waves caused by massive cosmic events, quickly analyzing data from observatories like LIGO and Virgo.
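To make the light-curve idea concrete, here is a minimal, illustrative sketch of the kind of pattern such analysis looks for: a small periodic dip in a star's brightness. This is not NASA's ExoMiner pipeline; the synthetic data, smoothing window, and detection threshold are all assumptions chosen for the example.

```python
import numpy as np

# Synthetic stellar light curve: constant brightness plus noise,
# with small periodic dips standing in for planetary transits.
rng = np.random.default_rng(0)
t = np.arange(2000)
flux = 1.0 + 0.001 * rng.standard_normal(t.size)
flux[(t % 400 >= 200) & (t % 400 < 210)] -= 0.004  # ~0.4% dips

# Smooth with a moving average to suppress noise, then flag samples
# well below the baseline (edges trimmed to avoid boundary effects).
window = 11
smooth = np.convolve(flux, np.ones(window) / window, mode="same")
dips = smooth < 1.0 - 0.002
dips[:window] = dips[-window:] = False

# Count groups of consecutive flagged samples as candidate transits.
events = np.flatnonzero(np.diff(dips.astype(int)) == 1)
print(f"candidate transit events: {len(events)}")  # expect 5
```

Systems like ExoMiner replace the hand-set threshold with a deep network trained on labeled Kepler data, but the underlying task is the same: separating faint, repeating signals from noise.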
Using AI for Smarter Spacecraft, Satellites, and More

AI is making spacecraft and satellites more autonomous and efficient. NASA's ASPEN (Automated Scheduling and Planning Environment) system (https://www.nasa.gov/general/2024-ai-use-cases/) helps plan and adjust mission operations. AI algorithms monitor spacecraft health, predict system failures, and even automate repairs when possible.

The U.S. Space Force has also embraced AI for satellite operations that automate data collection, detect anomalies, and improve satellite positioning (https://www.airandspaceforces.com/space-force-from-ai-pause-to-satellite-ops/). AI-driven models are also tracking orbital debris to help protect satellites and spacecraft from potential collisions. Similarly, the European Space Agency is using AI to control satellite constellations and filter data before transmission (https://www.esa.int/Enabling_Support/Preparing_for_the_Future/Discovery_and_Preparation/Artificial_intelligence_in_space), reducing the burden on ground station personnel and increasing mission efficiency.

Beyond space missions, NASA's AI research is also improving life on Earth. Through a partnership with IBM, NASA uses AI to analyze climate patterns (https://www.innovationnewsnetwork.com/how-nasa-is-utilising-ai-technologies-on-earth-and-in-space/53637/) and predict extreme weather events, helping us prepare before a disaster strikes. AI is also being used in cybersecurity operations (https://www.forbes.com/councils/forbestechcouncil/2025/03/04/cyber-resilience-in-aerospace-and-space-lessons-from-incident-response-failures-and-ai-driven-solutions/) to protect against threats and mitigate damage from successful attacks.

A Vision for 2040 and Beyond

NASA is setting its sights on the future with the NASA 2040 AI Track (https://ntrs.nasa.gov/api/citations/20180002010/downloads/20180002010.pdf), an initiative focused on advancing AI in space exploration. Launched in 2024, this effort aims to enhance AI's role in autonomous decision-making, spacecraft navigation, and scientific discovery.

To support these goals, NASA established the AI Strategy Team, which focuses on integrating AI more deeply into missions. The team is working to develop AI systems that can handle complex, real-time scenarios, such as adjusting a rover's path on a distant planet or responding to unexpected hazards. By developing these capabilities, NASA is positioning AI as a key partner in future space missions, ensuring more efficient and autonomous operations in deep space exploration.

"It is important to see AI not as a threat that will replace the work of humans but as a tool to make our work easier and more efficient." —David Salvagnini, Chief Data Officer and Chief Artificial Intelligence Officer, NASA

Exploring New Frontiers in Space and AI at Capitol Tech

Capitol Technology University offers undergraduate and graduate programs in Astronautical Engineering (/fields-of-study/aviation-and-astronautical-sciences) that prepare you for an impactful and long-lasting career in the field. Our student centers and labs (/student-experience/centers-and-labs/space-flight-operations-training-center-sfotc), as well as our on-campus ALPHA Observatory (/student-experience/centers-and-labs/alpha-observatory), provide hands-on experience with satellite ground stations and balloon payloads (/student-experience/builder-culture/student-projects), imparting the technical skills students need to succeed.

Our degree programs in AI (/fields-of-study/computer-science-artificial-intelligence-and-data-science), including the first-of-its-kind Bachelor of Science in AI in Maryland (/degrees-and-programs/bachelors-degrees/artificial-intelligence-bs), prepare you to explore the many facets, skills, and ethics involved in this new technology, as well as the many ways AI is being used across a spectrum of industries. Our AI Center of Excellence (/student-experience/centers-and-labs/ai-center-of-excellence-aice) also fosters a wider and more dynamic ecosystem for research, education, and industry collaboration.

Want to learn more?
We invite you to attend our GreyCon Conference on July 15 (/degrees-and-programs/stem-events/greycon), an opportunity to explore this new frontier in technology emerging at the intersection of space, AI, and cybersecurity.

To discover our academic programs, contact our Admissions team (admissions@captechu.edu) or request more information (/request-information).

Edited by Erica Decker


Hybrid Quantum-Classical Machine Learning: Introduction
May 16, 2024

[Image: Quantum Machine Learning White Paper Blog]

In this first of a series of special guest blogs, Dr. Alexander Perry, Adjunct Professor specializing in Computer Science and Quantum Computing at Capitol Technology University, shares insights from his white paper on his experience with hybrid quantum-classical machine learning (HQML).

To learn more about this exciting field of study, read below, and be sure to join Dr. Perry on Thursday, May 23 at 12:00 p.m. (EST) for his Cap Tech Talk webinar, "Introduction to Hybrid Quantum-Classical Machine Learning" (registration: /webinars-and-podcasts/cap-tech-talks-webinars/introduction-hybrid-quantum-classical-machine).

By Dr. Alexander Perry
Some of the latest buzz terms in global industry and governments are "machine learning," "classical computers" (think desktop or server), and "noisy intermediate-scale quantum (NISQ)." Industry and governments are pouring millions of dollars into quantum applications to achieve increased performance and solve classically intractable challenges. Enhancing classical machine learning is one application seeing keen interest.

This blog is the first in a series where I will give my thoughts and opinions on hybrid quantum-classical machine learning (HQML). I will base my thoughts on existing works, the relevance of HQML, and use cases in cybersecurity. I will then propose a prototype implementation framework.

[Image: NQIAC Presentation - Slide 32]
(Image Credit: 2022, National Quantum Initiative Advisory Committee, Slide 32, https://www.quantum.gov/wp-content/uploads/2023/01/NQIAC-Slides-2022-12-16.pdf)

Hybrid quantum-classical machine learning (HQML) augments conventional machine learning with a quantum processing device, component, or unit (which we will call a QPU for simplicity). HQML is expected to have applications in fields such as cybersecurity (e.g., data classification) and quantum chemistry (e.g., drug discovery).
In cybersecurity, HQML is expected to help classify large volumes of network traffic, among other applications, faster than classical/conventional approaches alone.

[Image: NQIAC Presentation - Slide 64]
(Image Credit: 2022, National Quantum Initiative Advisory Committee, Slide 64, https://www.quantum.gov/wp-content/uploads/2023/01/NQIAC-Slides-2022-12-16.pdf)

QPUs are currently implemented as NISQ devices from companies including IBM, Quantinuum, and QuEra, available through cloud-based QPU services or physical devices. NISQ devices are limited-scale quantum devices/computers (LSQD) impaired by environmental noise. Providers offer open-source software libraries in languages like Python to augment and enhance classical machine learning.

[Image: ArXiv Image on Page 2]
(Image Credit: 2019, Cornell University, Arxiv, Page 2, https://arxiv.org/pdf/1906.07682)
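To give a concrete sense of what those libraries look like, here is a minimal hybrid classification sketch built on IBM's open-source Qiskit stack. It is illustrative only: the four-point toy dataset stands in for engineered network-traffic features, and the names used (ZZFeatureMap, FidelityQuantumKernel, QSVC) reflect recent versions of the qiskit-machine-learning package and may differ in other releases.

```python
import numpy as np
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel
from qiskit_machine_learning.algorithms import QSVC

# Toy two-feature dataset standing in for network-traffic features.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.8, 0.9], [0.9, 0.8]])
y = np.array([0, 0, 1, 1])

# Classical data is encoded into quantum states by a feature map;
# the QPU (or a simulator) estimates state overlaps as a kernel.
feature_map = ZZFeatureMap(feature_dimension=2, reps=2)
kernel = FidelityQuantumKernel(feature_map=feature_map)

# The quantum kernel feeds a classical support vector classifier:
# this split of labor is the "hybrid" in HQML.
model = QSVC(quantum_kernel=kernel)
model.fit(X, y)
print(model.predict(X))  # expected: [0 0 1 1]
```

The division of labor here is typical of NISQ-era designs: the quantum device handles only the small subroutine it is suited for (estimating similarity in a high-dimensional state space), while optimization and prediction remain classical.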
Knowing this, we will examine and apply HQML with NISQ devices to real-world cybersecurity challenges in a series of blogs composed of three sections:

1. HQML, Who Cares?
2. HQML Case Studies Explored
3. Proposal: A Scalable Prototyping Framework

The first section will explore the relevance of HQML on limited-scale quantum computers using a modified version of the Heilmeier Catechism (https://www.darpa.mil/work-with-us/heilmeier-catechism):

- What is HQML via Limited-Scale Quantum Computing?
- Who cares? If you are successful, what difference will it make?
- What are you trying to do?
- How is it done today, and what are the limits of current practice?
- What is new in your approach, and why do you think it will be successful?
- What are the risks?
- How much will it cost?
- How long will it take?
- What are the mid-term and final "exams" to check for success?

The second section will present the findings of HQML case studies on relevant real-world challenges such as data classification and clustering. And the third section will outline a proposed scalable open-source framework for rapid prototyping of HQML applications.

In review, enhancing classical machine learning using QPUs in HQML is seeing keen interest from industry and governments globally. Providers offer open-source software libraries in languages like Python to support this interest. In cybersecurity particularly, HQML is expected to help classify large volumes of network traffic faster than classical approaches. With this in mind, we will examine and apply HQML on LSQD to real-world cybersecurity challenges in a series of blogs. Stay tuned!

Author

Alexander Perry is an adjunct professor at Capitol Technology University (http://www.captechu.edu) and a data scientist performing applied research in hybrid quantum-classical machine learning (HQML). His expertise includes cyber, data science, artificial intelligence/machine learning (AI/ML), and quantum computing. His previous roles have included software engineer, system administrator, data scientist, technical director, and data science team lead.

References

1. Biamonte J., Wittek P., Pancotti N., Rebentrost P., Wiebe N., and Lloyd S. "Quantum machine learning", arXiv (2018). url: https://arxiv.org/abs/1611.09347 (34)
2. Moler K., Tahan C., Abo-Shaeer J., Chong F., Clarke J., Frincke D., Herrera G., Mason N., Oliver W., Preskill J., Ritter M., Schoelkopf R., Svore K., Wang J., Ye J., and Wong T. et al. "NQIAC Slides 2022-12-16", National Quantum Initiative Advisory Committee (NQIAC) (2022). url: https://www.quantum.gov/wp-content/uploads/2023/01/NQIAC-Slides-2022-12-16.pdf (50)
3. Department of Energy, Office of Science. "DOE Announces $24M for Research on Quantum Networks", HPC Wire (2023). url: https://www.hpcwire.com/off-the-wire/doe-announces-24m-for-research-on-quantum-networks/
4. Thomas W. "FY24 Budget Outlook: Department of Defense", American Institute of Physics, FYI: Science Policy News (2023). url: https://ww2.aip.org/fyi/fy24-budget-outlook-department-of-defense
5. Edwards J. "NIST Issues Congressionally Mandated Report on Emerging Tech Areas", ExecutiveGov (2023). url: https://executivegov.com/2023/08/nist-issues-congressionally-mandated-report-on-emerging-tech-areas/
6. National Science Foundation. "NSF Invests $38M to Advance Quantum Information Science and Engineering", HPC Wire (2023). url: https://www.hpcwire.com/off-the-wire/nsf-invests-38m-to-advance-quantum-information-science-and-engineering/
7. Williams A. "AFRL opens Extreme Computing centre for quantum computing research", Electronics Weekly (2023). url: https://www.electronicsweekly.com/news/research-news/afrl-opens-extreme-computing-centre-for-quantum-computing-defence-research-2023-08/
8. Department of Energy, Office of Science. "DOE Announces $11.7 Million for Research on Quantum Computing", Department of Energy (2023). url: https://www.energy.gov/science/articles/department-energy-announces-117-million-research-quantum-computing
9. IBM Corp. "Truist and IBM Collaborate on Emerging Technology Innovation and Quantum Computing", HPC Wire (2023). url: https://www.hpcwire.com/off-the-wire/truist-and-ibm-collaborate-on-emerging-technology-innovation-and-quantum-computing/
10. Ambrose M. "Expansion of National Quantum Initiative Pitched to Science Committee", American Institute of Physics, FYI: Science Policy News (2023). url: https://ww2.aip.org/fyi/expansion-of-national-quantum-initiative-pitched-to-science-committee
11. Benedetti M., Lloyd E., Sack S., and Fiorentini M. "Parameterized quantum circuits as machine learning models", arXiv (2019). url: https://arxiv.org/abs/1906.07682 (2)
12. Qiskit Machine Learning Development Team. "Qiskit: An Open-source Framework for Quantum Computing", IBM Corp. (2024). doi: 10.5281/zenodo.2573505. url: https://www.ibm.com/quantum/qiskit
"Writing a Hybrid Quantum Algorithm using the Intel® Quantum SDK Beta Version", Linkedin (2022). url: <a href="https://www.linkedin.com/pulse/writing-hybrid-quantum-algorithm-using-intel-sdk-beta-ibrahim/" target="_blank">https://www.linkedin.com/pulse/writing-hybrid-quantum-algorithm-using-intel-sdk-beta-ibrahim/</a> (133)&nbsp;</p> </li> </ol> <ol start="14"> <li> <p>Chang D. "Parameterized Quantum Circuits with Quantum Kernels for Machine Learning: A Hybrid Quantum-Classical Approach", arXiv (2022). url: <a href="https://arxiv.org/abs/2209.14449" target="_blank">https://arxiv.org/abs/2209.14449</a> (3)&nbsp;</p> </li> </ol> <ol start="15"> <li> <p>Britt K. and Humble T. "High-Performance Computing with Quantum Processing Units", arXiv (2015). <a href="https://arxiv.org/abs/1511.04386" target="_blank">https://arxiv.org/abs/1511.04386</a>&nbsp;</p> </li> </ol> <ol start="16"> <li> <p>Schuld M. and Killoran N. "Quantum machine learning in feature Hilbert spaces", arXiv (2018). url: <a href="https://arxiv.org/abs/1803.07128" target="_blank">https://arxiv.org/abs/1803.07128</a>&nbsp;</p> </li> </ol> <ol start="17"> <li> <p>Heilmeier G. “The Heilmeier Catechism”, Defense Advanced Research Projects Agency (DARPA, 1976). url: <a href="https://www.darpa.mil/work-with-us/heilmeier-catechism" target="_blank">https://www.darpa.mil/work-with-us/heilmeier-catechism</a>&nbsp;</p> </li> </ol> <p>&nbsp;</p> Categories: <a href="/taxonomy/term/38" hreflang="en">Computer Science, Artificial Intelligence and Data Science</a> <section id="section-48076" class="section background-white"> <div class="super-contained"> </div> </section> Thu, 16 May 2024 15:35:31 +0000 emdecker 11731 at Neuralink's Brain Chip: How It Works and What It Means /blog/neuralinks-brain-chip-how-it-works-and-what-it-means Neuralink's Brain Chip: How It Works and What It Means <span><span lang about="/user/68991" typeof="schema:Person" property="schema:name" datatype>bcook</span></span> <span><time datetime="2024-02-09T12:49:08-05:00" title="Friday, February 9, 2024 - 12:49">February 9, 2024</time><br><br> </span> <img loading="lazy" src="/sites/default/files/BLOG%20IMAGE%20SIZE%20%2851%29_0.png" width="640" alt="neuralink logo" typeof="foaf:Image"> <p>Elon Musk recently announced that Neuralink, his company aiming to revolutionize brain-computer interfaces (BCIs), <a href="https://www.nature.com/articles/d41586-024-00304-4" target="_blank">has successfully implanted a brain chip in a human</a> for the first time. The implantation of the device, called “the Link,” represents a leap forward in the realm of BCIs, which record and decode brain activity, that may allow for new innovations in health care, communication, and cognitive abilities.&nbsp;</p> <p>Though limited information on the technology is available and Neuralink’s claims have not been independently verified, here’s a look at the Link, its functionality, and the potential implications of this groundbreaking innovation.&nbsp;</p> <p>&nbsp;</p> <h2>The Technology Behind the Neuralink Chip&nbsp;</h2> <p><a href="https://neuralink.com/" target="_blank">Described as fully implantable and "cosmetically invisible</a><a href="https://neuralink.com/">"</a>, the Neuralink chip uses thin, flexible threads equipped with 1,024 electrodes that record the activity of neurons, the nerve cells that send messages all over the body to drive nearly all human functions. 
Described as fully implantable and "cosmetically invisible" (https://neuralink.com/), the Neuralink chip uses thin, flexible threads equipped with 1,024 electrodes that record the activity of neurons, the nerve cells that send messages all over the body to drive nearly all human functions. The coin-sized device is powered by an advanced custom chip within the implant that processes these signals and transmits them to a digital device through a standard Bluetooth connection, a novel step in BCI development. Surgical robots meticulously weave the threads into the cerebral cortex, which is responsible for the brain's higher-level processes like learning and emotion, to ensure precise placement of the electrodes.

How the Neuralink Chip Works

Initially focused on aiding individuals with severe paralysis, the Neuralink chip aims to restore personal control over limbs, prosthetics, or communication devices. By recording and decoding neural signals from individual neurons and then transmitting them back to the brain using electrical stimulation (https://builtin.com/hardware/what-is-neuralink), the chip enables users to control devices solely through thought. Compared to other BCIs, Neuralink's approach targets individual neurons, providing crucial data for sophisticated thought-decoding. Notably, the company has also developed a robot that can surgically implant the device with more precision and efficiency than a human surgeon.
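To illustrate the decoding step in the abstract, the sketch below fits a linear decoder that maps simulated firing rates to an intended two-dimensional cursor velocity. This is purely illustrative and not Neuralink's actual pipeline; the simulated tuning weights, noise level, and least-squares model are all assumptions.

```python
import numpy as np

# Toy BCI decoding: map neuron firing rates to 2-D cursor velocity.
rng = np.random.default_rng(1)
n_neurons, n_samples = 64, 500

# Hypothetical ground truth: each neuron is linearly "tuned" to velocity.
tuning = rng.standard_normal((n_neurons, 2))
velocity = rng.standard_normal((n_samples, 2))  # intended movements
rates = velocity @ tuning.T + 0.5 * rng.standard_normal((n_samples, n_neurons))

# Fit a linear decoder by least squares: firing rates -> velocity.
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a fresh activity pattern into a movement command.
new_rates = np.array([[1.0, -0.5]]) @ tuning.T
print(new_rates @ decoder)  # approximately [1.0, -0.5]
```

Real decoders must also handle spike detection, non-stationary signals, and closed-loop feedback, but the core idea of learning a mapping from neural activity to intent is the same.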
According to neurobiologists, Neuralink's device does not offer much in the way of new technological developments; several companies have been developing surface electrodes offering similar technology for decades. Neuralink's innovation, however, is that its device packages many existing technologies into a single system and connects electrodes with individual neurons.

The Future of Implantable Devices like Neuralink

If the technology proves successful, the company hopes to create direct brain-to-computer interfaces (https://www.npr.org/2024/01/30/1227850900/elon-musk-neuralink-implant-clinical-trial) that connect a person's thoughts to digital devices. Musk's long-term vision is to combine human consciousness with artificial intelligence, a claim that has drawn considerable skepticism from scientists.

In the medical realm, Neuralink could offer new avenues for treating neurological disorders like Parkinson's disease. It could be used to control exoskeletons and prosthetics that restore movement in individuals with paralysis or amputations. The technology also opens doors to human enhancement through memory augmentation and enhanced cognitive abilities.

Managing Ethical Concerns of Neuralink

While Neuralink's trial was approved by the FDA, it is unregistered in the National Institutes of Health's online clinical trial database. Little information on the tests has been released publicly, and the company has not shared where implants are being done or which outcomes are being assessed. Researchers speculate that the company will likely test which chips show the best performance, are the most durable, and offer the best user experience. With the initial trial scheduled to last five years, the chip's long-term functionality will be key to its success, as replacing electrodes after implantation is unlikely.

Despite the device's technological promise, significant ethical concerns have been raised regarding privacy, surveillance, and societal impacts. Previous trials on monkeys and pigs showed promising developments but also reported issues of paralysis and seizures (https://www.aljazeera.com/news/2024/1/31/what-is-elon-musks-neuralink-brain-chip-now-being-tested-on-humans). Ensuring equitable access and addressing potential disparities will also be imperative. And with long-term effects still unknown, neurotechnology researchers maintain that safety and transparency in clinical trials remain vital considerations.

Neuralink's development signals a potential transition from external wearables to internal implants that could usher in a new era of deeper technological integration with the nervous system. While this new brain chip represents a significant advancement in BCIs, scientists and neurologic experts emphasize that it's crucial to approach it with cautious optimism. Understanding the technology, its current capabilities, and its potential future implications is essential for responsible development and ethical implementation. As human trials begin and the technology progresses, addressing ethical concerns and managing expectations will be key to realizing its full potential.

Capitol Tech's programs in Computer Science, Artificial Intelligence and Data Science (/fields-of-study/computer-science-artificial-intelligence-and-data-science) and Engineering Technologies (/fields-of-study/engineering-technologies) can prepare you to develop new technologies that will shape the future of society. For more information on Capitol Technology's degree programs, contact our Admissions team at admissions@captechu.edu.


AI and Counterterrorism: Potential, Pitfalls, and the Path Forward
February 1, 2024

[Image: woman in front of map screens]

As artificial intelligence (AI) transforms nearly every aspect of modern life, it has ushered in new opportunities, efficiencies, and conveniences. It is also bolstering efforts to enhance digital and national security.

AI is now poised to alter the landscape of counterterrorism, offering an arsenal of analytical tools and predictive capabilities. However, its integration into this sensitive domain raises several ethical and practical concerns that must be examined and legislated to ensure its responsible use. While the potential benefits of AI in counterterrorism are undeniable, overlooking its potential drawbacks could have far-reaching consequences.

Unleashing the Potential of AI in Counterterrorism

AI's power lies in its ability to process information and surface insights at a speed and scale no human analyst can match.
Here's a look at some of the ways AI is reshaping counterterrorism.

- Data-driven insights: AI excels at navigating mountains of data, from financial transactions to social media activity, extracting hidden patterns and connections that human analysts might miss. This ability enables the prediction of radicalization pathways and potential events (https://www.mccormick.northwestern.edu/news/articles/2023/10/advancing-ai-systems-in-cybersecurity-counterterrorism-and-international-security/), the disruption of terrorist funding networks, and the tracking of suspicious movements. For example, AI models can analyze vast quantities of financial data and piece together seemingly insignificant transactions that reveal a web of illicit transfers funding a terrorist organization (a simplified sketch follows this list).
- Cybersecurity fortification: The digital world has become a battleground for terrorist groups, who use it for propaganda, recruitment, and communication. AI algorithms can serve as front-line defenders, detecting and stopping cyberattacks, identifying malicious online activity, and rooting out malicious accounts. For example, an AI system could monitor online communications to recognize and neutralize bots that are propagating radical rhetoric before they can infiltrate vulnerable communities.
- Enhanced border security: AI-powered systems like facial recognition cameras can revolutionize border control by analyzing biometric data from travel documents and scanning video footage at checkpoints. This allows for the swift identification of individuals on watchlists or exhibiting suspicious behavior, streamlining security processes while significantly improving detection accuracy.
- Resource optimization and risk assessment: By sifting through crime trends, threat reports, and intelligence data, AI can assist authorities in prioritizing resources and directing investigations toward the most pressing threats (https://www.science.org/doi/10.1126/sciadv.abg4778). For example, an AI model that analyzes past attacks, social media posts, and suspicious financial activity can identify areas with the highest risk of imminent attacks, enabling authorities to allocate resources proactively.
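The sketch below makes the data-driven-insights idea concrete at toy scale: an unsupervised anomaly detector flags transactions that deviate from normal behavior. It uses scikit-learn's IsolationForest; the synthetic features and contamination rate are assumptions, and real financial-intelligence systems are vastly more complex.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Toy transaction log: [amount in USD, hour of day]. Mostly routine
# daytime purchases, plus a few large late-night transfers.
rng = np.random.default_rng(7)
normal = np.column_stack([
    rng.normal(120, 40, 500),  # everyday purchase amounts
    rng.normal(14, 3, 500),    # daytime hours
])
odd = np.array([[9500.0, 3.0], [8700.0, 2.0], [9900.0, 4.0]])
X = np.vstack([normal, odd])

# Isolation forests separate outliers quickly: anomalies need far
# fewer random splits to be isolated than typical points do.
model = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = model.predict(X)  # -1 marks suspected anomalies
print(X[flags == -1])     # the late-night transfers stand out
```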
Navigating the Pitfalls of AI in Counterterrorism

Despite AI's incredible potential and already significant impact, it carries serious responsibility. Misuse of AI, or systems falling into the wrong hands, is a considerable concern.

- Bias and discrimination: AI's weakness lies in its dependence on data supplied by humans, which carries built-in biases and stereotypes. Biased data leads to biased algorithms, potentially profiling and targeting individuals or groups (https://www.icct.nl/publication/states-prevention-terrorism-and-rule-law-challenging-magic-artificial-intelligence-ai) based on factors like race, religion, or ethnicity. This discriminatory application of AI could erode trust in law enforcement and exacerbate existing societal inequalities.
- Privacy concerns: Extensive data collection and analysis raise justifiable concerns about individual privacy and potential violations of civil liberties (https://www.un.org/counterterrorism/sites/www.un.org.counterterrorism/files/countering-terrorism-online-with-ai-uncct-unicri-report-web.pdf). The use of facial recognition technology, social media monitoring, and other intrusive techniques in the name of counterterrorism must be balanced with robust safeguards to protect privacy rights. For example, AI-powered cameras designed to track terrorists could also track ordinary individuals across cities, creating detailed profiles of their movements and interactions and raising unsettling questions about the erosion of personal liberties.
- Transparency and accountability: AI models can be intricate and opaque, making it challenging to explain their decision-making processes and address instances of bias or error. For example, if an AI system flags an individual as a potential threat without providing any explanation or evidence, that person has no recourse to challenge the system's judgment.
- Misuse and weaponization: The potential for malicious actors to weaponize AI (https://www.securitymagazine.com/articles/99908-ai-included-in-dhs-national-security-risks-for-2024) cannot be ignored. The development of autonomous weapons systems powered by AI poses significant ethical and safety concerns, raising the potential for a dangerous AI arms race. If a terrorist organization gained access to AI-powered security systems, it could manipulate them to strengthen its attacks rather than protect against them.

Navigating the Path Forward for AI in Counterterrorism

The integration of AI into counterterrorism requires a delicate balancing act between security needs and ethical considerations (/blog/ethical-considerations-of-artificial-intelligence). To unlock its potential while mitigating its risks, developers, legislators, and counterterrorism agencies must define and evaluate standards to ensure its responsible use (https://www.bbc.com/news/technology-67872767).

- Ethical development and deployment: Comprehensive ethical frameworks must be established to ensure the responsible development and deployment of AI in counterterrorism, prioritizing fairness, transparency, and accountability. This includes regular audits of algorithms, public reporting on AI's use, and the establishment of independent oversight bodies.
- Bias mitigation and fairness: Agencies must employ rigorous data curation processes to minimize biased training data and implement fairness-enhancing techniques to ensure unbiased decision-making. This requires collaboration between technologists, ethicists, and social scientists to develop robust solutions (a simple example of such a fairness check appears after this list).
- Privacy protection and civil liberties: Legislators must implement robust legal and regulatory frameworks to safeguard individual privacy rights and prevent the erosion of civil liberties while enabling effective counterterrorism measures. This includes data minimization practices, strong encryption standards, and clear guidelines on data retention and access.
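As one deliberately simplified example of a fairness-enhancing check, the sketch below computes per-group flag rates for a model's output; a large gap signals disparate impact worth auditing. The scores, group labels, and threshold are invented for illustration.

```python
import numpy as np

# Hypothetical model outputs: risk scores plus a synthetic group label.
rng = np.random.default_rng(3)
scores = rng.uniform(0, 1, 1000)
groups = rng.choice(["A", "B"], size=1000)
flagged = scores > 0.9  # decision threshold (an assumption)

# Demographic parity difference: the gap in flag rates between groups.
rates = {g: flagged[groups == g].mean() for g in ("A", "B")}
gap = abs(rates["A"] - rates["B"])
print(rates, f"parity gap = {gap:.3f}")  # audit the model if this is large
```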
AI, like all technological advancements, should be met with both optimism and the weight of responsibility. While AI's potential to revolutionize counterterrorism is undeniable, technologists and legislators must prioritize the development and implementation of robust safeguards to mitigate the risks of bias, discrimination, and privacy violations. If these standards can be met and upheld, AI can serve counterterrorism in a way that assures security while protecting individual freedoms.

Capitol Tech's programs in Computer Science, Artificial Intelligence and Data Science (/fields-of-study/computer-science-artificial-intelligence-and-data-science) can give you the technical and interpersonal skills required to ensure AI's effective and responsible use in counterterrorism. Additionally, Capitol's cutting-edge degree programs in Counterterrorism (/degrees-and-programs/bachelors-degrees/counterterrorism-bs) can equip you with the knowledge and resources needed to protect our nation from both foreign and domestic threats. For more information on Capitol Technology's degree programs, contact our Admissions team at admissions@captechu.edu.


International Women's Day Celebration Today
March 8, 2022

[Image: Capitol Tech Women in STEM]

Today, March 8th, is the annual observance of International Women's Day, a day celebrating the significant impact of women on all aspects of our society. In addition to this single-day observance, the entire month of March is dedicated to honoring the influential women who have contributed to the important societal, economic, political, and cultural changes seen throughout history. This year's themes are "Gender equality today for a sustainable tomorrow" and #BreaktheBias. On social media, many are posting photos of themselves with arms crossed, symbolizing an "X", as in X-chromosome, and also "stop" the bias.

Taking time to celebrate these contributions is important to empowering and encouraging women to seize opportunities in historically male-dominated fields. Diversity in these fields notably increases creativity and cultural insight, ensures representation and engagement of all genders, and drives the advancement of STEM areas.

Over the month of March, Capitol Tech invites you to partake in our retrospective honoring the many amazing women who have contributed to innovations in STEM. The following biographies outline the lives of women who redefined their fields. From programmers, engineers, and hackers to celebrities, click through the links below to learn more about these pioneering women:

- Amanda Finnerty: Director of Internal Operations for Commodore Builders (/blog/amanda-finnerty-director-of-internal-operations-commodore-builders)
- Danielle Dy Buncio: Co-founder and CEO of VIATechnik, a Construction Technology Firm (/blog/danielle-dy-buncio-co-founder-and-ceo-of-viatechnik-construction-technology-firm)
- Rebecca Clark: Operations Executive for Skanska, a Global Construction Firm (/blog/rebecca-clark-operations-executive-skanska-global-construction-firm)
- Dr. Nina Tandon: Co-Founder of the First Company to Grow Human Bones for Reconstruction (/blog/dr-nina-tandon-co-founder-of-first-company-grow-human-bones-reconstruction)
- Ada Lovelace: The Mother of Computer Programming (/blog/ada-lovelace-mother-of-computer-programming)
- Hedy Lamarr: Star of the Silver Screen and Inventor of a WWII-Changing Communications Device (/blog/hedy-lamarr-star-of-silver-screen-and-inventor-of-wwii-changing-communications-device)
- Edith Clarke: A Trailblazing Leader for Women and a Pioneer in Computing and Engineering (/blog/edith-clarke-trailblazing-leader-women-and-pioneer-computing-and-engineering)
- Katherine Johnson: In Commemoration of the Mathematician and Computer Scientist Responsible for the First U.S. Moon Landing (/blog/katherine-johnson-commemoration-of-mathematician-and-computer-scientist-responsible-first-us)
- Ana Sol Gutierrez: "I wouldn't follow the role they attributed to me" (/blog/ana-sol-gutierrez-i-wouldnt-follow-role-they-attributed-me)
- Kimberly Bryant: Accomplished Electrical Engineer and Founder of Black Girls Code (/blog/kimberly-bryant-accomplished-electrical-engineer-and-founder-of-black-girls-code)
- Sabrina Gonzalez Pasterski: The Young Woman Dubbed the "Next Albert Einstein" (/blog/sabrina-gonzalez-pasterski-young-woman-dubbed-next-albert-einstein)
- Velma P. Scantlebury, M.D.: The First Black Female Transplant Surgeon in the U.S. (/blog/velma-p-scantlebury-md-first-black-female-transplant-surgeon-us)
- Mae Jemison: Doctor, Teacher, Founder of Two Technology Companies, and the First African-American Woman in Space (/blog/mae-jemison-doctor-teacher-founder-of-two-technology-companies-and-first-african-american)
- Judith Milhon: Programmer, Civil Rights Activist, Hacker (/blog/girls-need-modems-battle-cry-of-hacktivist-jude-milhon)
- Judith Love Cohen: Aerospace Engineer, Feminist, Actor Jack Black's Mother (/blog/science-behind-jack-black)

To read about past Capitol Tech Women's Day celebrations, click here (/blog/celebrating-women-stem-during-womens-history-month-2021).


High-Performance Computing (HPC): Applications and Trends in Computer Science
March 7, 2022

[Image: computer server]

High-performance computing (HPC) is being applied across a growing number of STEM fields. From bioinformatics and genetic research to running artificial intelligence programs and space flight simulations, any instance where huge amounts of data and complex calculations need to be processed at high speed is where HPC becomes not only useful but necessary.

Typical computers cannot handle the amount of "big data" generated by, say, sequencing the human genome, which can produce several terabytes of complex data. This and other types of research often require the heavier-duty computing seen with HPC. HPC setups comprise a system of servers using supercomputers equipped with powerful processors, graphics cards, and memory. According to IBM, these setups can be one million times more powerful than the fastest personal laptop. With HPC, the ability to correctly process large amounts of data is as important as the ability to do so quickly. But this comes at a price, as some trade-off between speed and correctness has long been considered unavoidable. However, a team of computer science research students and professors at the Massachusetts Institute of Technology (MIT) is now revisiting this issue and has developed a promising solution: through their joint effort, they have created a new programming language written specifically for HPC.
And it all comes down to producing the right numbers. "Everything in our language is aimed at producing either a single number or a tensor," explains MIT PhD student Amanda Liu. The language is built around what the team calls "a tensor language," or ATL. Tensors are n-dimensional arrays that generalize one-dimensional vectors and two-dimensional matrices, allowing higher-dimensional data to be computed directly. While tensor-based computation already exists in tools such as TensorFlow, available in the well-known R and Python ecosystems, MIT Assistant Professor of Electrical Engineering & Computer Science Jonathan Ragan-Kelley notes that such systems have been seen to cause slowdowns and complicate downstream optimizations, "violating the Cheap Gradients Principle." According to Cornell University, this principle states that "the computational cost of computing the gradient of a scalar-valued function is nearly the same (often within a factor of 5) as that of simply computing the function itself...[and] is of central importance in optimization." The issue stems from the way programs like TensorFlow compute certain original functions relative to their gradients. Thus, the need for a more specialized HPC language arose.
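For intuition about the Cheap Gradients Principle, the sketch below uses JAX, one of several Python automatic-differentiation tools (an assumption chosen for illustration, not a tool named by the MIT team), to show that evaluating the gradient of a scalar-valued tensor function costs roughly the same order of time as evaluating the function itself.

```python
import time
import jax
import jax.numpy as jnp

# A scalar-valued tensor function: reduce a smooth transform of x.
def f(x):
    return jnp.sum(jnp.tanh(x @ x.T))

x = jnp.ones((500, 500))
grad_f = jax.grad(f)  # reverse-mode AD: full gradient in one pass

# Warm up (compilation/caching), then time both evaluations.
f(x).block_until_ready()
grad_f(x).block_until_ready()
t0 = time.perf_counter(); f(x).block_until_ready()
t1 = time.perf_counter(); grad_f(x).block_until_ready()
t2 = time.perf_counter()
print(f"f: {t1 - t0:.4f}s, grad f: {t2 - t1:.4f}s")  # same order of magnitude
```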
"Given that high-performance computing is so resource-intensive, you want to be able to modify, or rewrite, programs into an optimal form in order to speed things up," Liu explains. This is accomplished by the framework toolkit that comes with the ATL language, which shows the ways in which simplified program conversion can be attained. A "proof assistant" is included as well, which builds on the existing Coq language and helps guarantee that each optimization is correct by performing mathematical proofs.

While the language is still a prototype, there are indications that ATL could be the next avenue in HPC optimization, especially when approaching the more complex issue of cybersecurity in the HPC environment. Some emerging studies show promise in using tensor decompositions to secure data in the cloud or on Amazon Web Services (AWS), but further research needs to be done in this area.

Want to learn more about our computer science, artificial intelligence (/fields-of-study/computer-science-artificial-intelligence-and-data-science), and cybersecurity (/fields-of-study/cyber-and-information-security) program offerings? Visit our website (/degrees-and-programs) to learn more about Capitol Tech's diverse degree programs, or contact admissions@captechu.edu.

References:

Bernstein, G., Mara, M., Li, T., Maclaurin, D., and Ragan-Kelley, J. (2020). Differentiating a Tensor Language. Arxiv. https://arxiv.org/pdf/2008.11256.pdf

Cornell University. (2018). Mathematics > Optimization and Control > Provably Correct Automatic Subdifferentiation for Qualified Programs. Arxiv. https://arxiv.org/abs/1809.08530

IBM. (2022). What is supercomputing technology? https://www.ibm.com/topics/supercomputing

Nadis, S. (7 Feb, 2022). A new programming language for high-performance computers. MIT News. https://news.mit.edu/2022/new-programming-language-high-performance-computers-0207

Ong, J., et al. (2021). Protecting Big Data Privacy Using Randomized Tensor Network Decomposition and Dispersed Tensor Computation [Experiments, Analyses & Benchmarks]. Arxiv. https://arxiv.org/pdf/2101.04194.pdf


Virtual Volunteer Opportunities – A Safe Way to Give Back
February 4, 2022

[Image: person smiling behind laptop]

Looking for a fun, virtual way to volunteer? Try SciStarter (https://scistarter.org/), a globally acclaimed community of citizen scientists affiliated with North Carolina State University and Arizona State University and grant-supported by the National Science Foundation (NSF). Notable partnerships and project sponsorships include NASA, Discover Magazine, the National Science Teachers Association (NSTA), and Girl Scouts of the USA (GSUSA).

SciStarter is an interactive site that allows community members to participate in volunteer opportunities that may otherwise be inaccessible to them due to location, time, COVID restrictions, and other factors. What is great about citizen science is that anyone who is interested can participate and contribute to high-quality, data-driven results, whether you are a beginner, a professional, or just generally interested in the field. The diversity of participants adds greater perspective and farther reach for data collection, creating well-rounded research that can be used by a team of professionals.
Participation also benefits the volunteer, offering educational experiences, networking opportunities, and ways to get involved in the betterment of your community.</span></span></span></p> <p><span><span><span>SciStarter has something for everyone – pick an interest through the “Project Finder” tab and sign up to help contribute to important research in that field. There are opportunities related to every degree program offered at Capitol Tech – cybersecurity, astronomy, computer science, construction – you name it, they’ve got it! And most projects can be completed right from your home or neighborhood.</span></span></span></p> <p><span><span><span>One notable project for <a href="/fields-of-study/cyber-and-information-security">cybersecurity</a> fans is the <a href="https://scistarter.org/nova-cybersecurity-lab">NOVA Cybersecurity Lab</a>. This project is sponsored by the Public Broadcasting Service (PBS) NOVA Science Series and Lockheed Martin. To conduct research for this project, you will play a series of fun, interactive games to help defend a company against cyberattacks. Crack passwords, determine risks, prioritize assets, decode programs, and compete in cyber battles. Your participation will help inform research and education solutions for future cyber efforts, while you gain an understanding of basic cyber threat scenarios.</span></span></span></p> <p><span><span><span>Another project, aimed at aspiring <a href="/degrees-and-programs/bachelors-degrees/astronautical-engineering-bs">astronautical engineers</a>, is <a href="https://scistarter.org/exoplanet-research-workshop">NASA’s Exoplanet Research Workshop</a>. This project is meant for college students who want to help observe transiting planets that lie outside of our solar system. By volunteering, participants help improve the prediction accuracy of transit events for large telescopes, like Hubble and the new <a href="/news-events/james-webb-space-telescope-launch-dec-24">James Webb Space Telescope</a>, and can even potentially discover new exoplanets by using transit timing variations to infer their existence. Many Capitol Tech graduates have contributed to missions like the Webb Telescope launch, and Capitol Tech is also breaking ground with a new <a href="/news-events/observatory-coming-capitol-campus">ALPHA Observatory</a>, so volunteering with these types of projects can help further develop your skillset in the classroom and in the field.</span></span></span></p>
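<p><em>For a rough sense of what those transit observations involve, here is a toy sketch in Python (the numbers are made up for illustration; this is not NASA’s pipeline): a transiting planet causes a small periodic dip in the star’s measured brightness, and systematic offsets of the observed dip times from a strict schedule – transit timing variations – can hint at additional unseen planets.</em></p> <pre><code>import numpy as np

# Toy light curve: flux dips by `depth` whenever the planet transits.
period, t0 = 3.52, 1.10        # orbital period and first mid-transit (days)
depth, duration = 0.01, 0.12   # fractional dimming and transit length (days)

t = np.linspace(0.0, 30.0, 5000)  # 30 days of observations
phase = (t - t0 + 0.5 * period) % period - 0.5 * period
flux = np.where(np.abs(phase) < duration / 2, 1.0 - depth, 1.0)
flux += np.random.default_rng(1).normal(0.0, 0.002, t.size)  # noise

# Mid-transit times predicted by a strictly periodic orbit; volunteers
# compare these against the dips actually observed, and consistent
# early/late arrivals (transit timing variations) can betray the
# gravitational pull of another, unseen planet.
n = np.arange(int(30.0 / period) + 1)
predicted_midtransits = t0 + n * period
print(predicted_midtransits)</code></pre>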
Volunteer opportunities like these can help you gain experience, network with peers, and <a href="https://mycapitol.captechu.edu/ICS/College_Offices/Career_Services/">add depth to your resume</a>. Consider volunteering today!</span></span></span></p> <p><span><span><span>For more information on building your resume and enhancing your career through volunteer or job opportunities, contact Capitol Tech <a href="https://mycapitol.captechu.edu/ICS/College_Offices/Career_Services/">Career Services</a>.</span></span></span></p> <figure role="group" class="align-center"> <div alt="SciStarter Logo Robot with Bird Inside Gear" data-embed-button="media_browser" data-entity-embed-display="media_image" data-entity-embed-display-settings="{&quot;image_style&quot;:&quot;&quot;,&quot;image_link&quot;:&quot;&quot;}" data-entity-type="media" data-entity-uuid="b82814a0-5b6a-477f-afd0-3b466468205b" title="SciStarter Logo Robot with Bird Inside Gear" data-langcode="en" class="embedded-entity"> <img loading="lazy" src="/sites/default/files/SciStarter%20Logo%20Robot%20with%20Bird%20Inside%20a%20Gear.jpg" alt="SciStarter Logo Robot with Bird Inside Gear" title="SciStarter Logo Robot with Bird Inside Gear" typeof="foaf:Image"> </div> <figcaption>Photo Credit: SciStarter.org</figcaption> </figure> <p>&nbsp;</p> <p><em>Written by Erica Decker</em></p> Categories: <a href="/taxonomy/term/38" hreflang="en">Computer Science, Artificial Intelligence and Data Science</a> Public Radio Broadcasting Day - A Glimpse at Capitol History /blog/public-radio-broadcasting-day-glimpse-capitol-history Public Radio Broadcasting Day - A Glimpse at Capitol History <span><span lang about="/user/69196" typeof="schema:Person" property="schema:name" datatype>emdecker</span></span> <span><time datetime="2022-01-13T14:58:17-05:00" title="Thursday, January 13, 2022 - 14:58">January 13, 2022</time><br><br> </span> <img loading="lazy" src="/sites/default/files/Capitol%20Technology%20University%20Campus_1.png" width="640" alt="Capitol Technology University Campus" typeof="foaf:Image"> <figure role="group" class="align-center"> <div alt="Capitol Radio Engineering Institute CREI 1927" data-embed-button="media_browser" data-entity-embed-display="media_image" data-entity-embed-display-settings="{&quot;image_style&quot;:&quot;&quot;,&quot;image_link&quot;:&quot;&quot;}" data-entity-type="media" data-entity-uuid="e7c730e8-7ab0-4000-9fc9-6e09144b82a5" title="Capitol Radio Engineering Institute (CREI) 1927" data-langcode="en" class="embedded-entity"> <img loading="lazy" src="/sites/default/files/Capitol%20Radio%20Engineering%20Institute%20CREI%201927.jpg" alt="Capitol Radio Engineering Institute CREI 1927" title="Capitol Radio Engineering Institute (CREI) 1927" typeof="foaf:Image"> </div> <figcaption>Capitol Radio Engineering Institute (CREI) 1927</figcaption> </figure> <p>&nbsp;</p> <p><span><span><span>January 13<sup>th</sup> marks the celebration of Public Radio Broadcasting Day, a day honoring the invention of the radio and the significant role it has played throughout history. Many people helped make the radio possible, contributing to its creation and evolution over the decades. Guglielmo Marconi is credited as the radio’s inventor, having transmitted the first long-distance Morse code signals across the Atlantic Ocean in 1901, although Nikola Tesla was also working on radio technology at the time and is often argued to be the true inventor of the radio.
Tesla began studying wireless transmission in the early 1890s, building on Heinrich Hertz’s earlier demonstration that radio waves exist, and his experiments led to the invention of the Tesla coil, which Marconi would later use in his radio demonstration. Lee de Forest is considered the “Father of Radio,” as he expanded on Marconi’s radio device to produce faster, more reliable signals and transmitted the first public broadcast in 1910. Advancements in radio technology would eventually lead to the invention of many other devices like televisions, satellites, cell phones, and the internet. The radio and its ability to broadcast has had lasting impacts on global communication, culture, and society as a whole.</span></span></span></p> <p><span><span><span>Some may not realize that Capitol Technology University has a strong connection with the radio, as the University was founded by a U.S. Navy Radioman named Eugene H. Rietzke. In the early 1900s, and especially during World War I, radios were used primarily as a way for military forces to communicate with each other rather than as a means of entertainment. Radio technology was still rudimentary despite a desperate need for reliable communications systems during wartime. Radio use was limited by unreliable transmission and signals, restricted frequency ranges, cumbersome and heavy equipment, and a shortage of trained personnel.</span></span></span></p> <p><span><span><span>Rietzke recognized the need for better radio technology and education. Thus, he founded <a href="/blog/capitol-technology-universitys-recounts-its-history-radio-engineering-national-radio-day">Capitol Radio Engineering Institute (CREI)</a> in 1927, which would later become Capitol Technology University. It began as a trade school, a technical school designed to train students for a specific trade career. <a href="/about-capitol/capitol-history">When he started his school</a>, he had only 40 students, but he worked tirelessly to build a curriculum, provide hands-on experience, and even write his own textbook to address the shortage of a specialized workforce in the field of radio. His students were able to learn from a field professional and study in laboratories with real equipment of the era, like vacuum tube radios, analog computers, and other electronics. By the start of WWII, radio technology had vastly improved, and CREI was ready with not 40 but 3,000 well-trained technicians prepared for this new era of radio. And Capitol has grown ever since.</span></span></span></p> <p><span><span><span>Although he started small and with limited resources, Rietzke saw a need for advancement and strove for better. Rietzke once said, “if you have imagination and if you have the willingness…I don’t see how you could fail” (Jarrell, 2002).
This ideal remains a part of Capitol today as students are always encouraged to find new ways to approach society’s needs, explore evolving technology, acquire hands-on laboratory experience, and learn from field professionals through the University’s many <a href="/degrees-and-programs">undergraduate and graduate programs</a>.</span></span></span></p> <p><span><span><span>For more information on Capitol history, visit the University’s website <a href="/about-capitol/capitol-history">here</a>.</span></span></span></p> <p>&nbsp;</p> <figure role="group"> <div alt="CREI Residence Students in Vacuum Tube Radio Lab" data-embed-button="media_browser" data-entity-embed-display="media_image" data-entity-embed-display-settings="{&quot;image_style&quot;:&quot;&quot;,&quot;image_link&quot;:&quot;&quot;}" data-entity-type="media" data-entity-uuid="729eebb9-67f2-41a7-9a00-47cbe684cb43" title="CREI Residence Students in Vacuum Tube Radio Lab" data-langcode="en" class="embedded-entity"> <img loading="lazy" src="/sites/default/files/CREI%20Residence%20Students%20in%20Vacuum%20Tube%20Radio%20Lab.jpg" alt="CREI Residence Students in Vacuum Tube Radio Lab" title="CREI Residence Students in Vacuum Tube Radio Lab" typeof="foaf:Image"> </div> <figcaption>CREI Residence Students in Vacuum Tube Radio Lab</figcaption> </figure> <p>&nbsp;</p> <p>&nbsp;</p> <p><span><span><span>References:</span></span></span></p> <p><span><span><span><em>Golden Age of Radio in the US.</em> (2022). Digital Public Library of America. Retrieved from <a href="https://dp.la/exhibitions/radio-golden-age/radio-frontlines">https://dp.la/exhibitions/radio-golden-age/radio-frontlines</a></span></span></span></p> <p><span><span><span>Jarrell, H. J. (2002). <em>The Evolution of Capitol College: An Oral History</em>. Capitol College.</span></span></span></p> <p><span><span><span>Smith, J. Y. (1983). <em>Eugene H. Rietzke, 85, Dies</em>. Washington Post. Retrieved from <a href="https://www.washingtonpost.com/archive/local/1983/01/05/eugene-h-rietzke-85-dies/bd1ee86d-56db-408a-ab2d-6362ed271c7d/">https://www.washingtonpost.com/archive/local/1983/01/05/eugene-h-rietzke-85-dies/bd1ee86d-56db-408a-ab2d-6362ed271c7d/</a></span></span></span></p> <p><span><span><span><em>Birth of public radio broadcasting</em>. (2022, Jan 11). In <em>Wikipedia</em>. Retrieved from <a href="https://en.wikipedia.org/wiki/Birth_of_public_radio_broadcasting">https://en.wikipedia.org/wiki/Birth_of_public_radio_broadcasting</a></span></span></span></p> <p>&nbsp;</p> <p><span><span><span>Photo Credits: </span></span></span></p> <p><span><span><span>Jarrell, H. J. (2002). <em>The Evolution of Capitol College: An Oral History</em>. 
Capitol College.</span></span></span></p> Categories: <a href="/taxonomy/term/38" hreflang="en">Computer Science, Artificial Intelligence and Data Science</a>, <a href="/taxonomy/term/42" hreflang="en">Engineering Technologies</a> <section id="section-33866" class="section background-white"> <div class="super-contained"> </div> </section> Thu, 13 Jan 2022 19:58:17 +0000 emdecker 8241 at Recent Breakthroughs in Quantum AI /blog/recent-breakthroughs-quantum-ai Recent Breakthroughs in Quantum AI <span><span lang about="/user/67246" typeof="schema:Person" property="schema:name" datatype>amschubert</span></span> <span><time datetime="2021-11-17T23:14:12-05:00" title="Wednesday, November 17, 2021 - 23:14">November 17, 2021</time><br><br> </span> <img loading="lazy" src="/sites/default/files/pexels-pixabay-373543.jpg" width="640" alt="Quantum artificial intelligence" typeof="foaf:Image"> <p><span><span><span>Artificial intelligence (AI) is a technological breakthrough that has changed the way we live our daily lives. It has brought us technology we often take for granted—such as smart home devices—to tackling large world problems like climate change. Like many technologies, however, classical AI techniques are reaching their limits in terms of computing power. In the never-ending search for bigger, better, faster—enter Quantum AI.</span></span></span></p> <p><span><span><span>Quantum AI uses quantum computing to improve computational tasks within AI and other related fields, such as machine learning (ML) and natural language processing (NLP). While seeming to be an ideal solution for the existing issues with classical AI, there have been some concern over “barren plateaus,” which occurs when optimization problems turn flat resulting in no clear path to a solution.</span></span></span></p> <p><span><span><span>Recent research from </span><a href="https://discover.lanl.gov/news/releases/1015-quantum-ai">Los Alamos National Laboratory (LANL)</a><span>, Absence of Barren Plateaus in Quantum Convolutional Neural Networks, published in </span><a href="https://journals.aps.org/prx/abstract/10.1103/PhysRevX.11.041011"><em>Physical Review X</em></a><span>, offered good news regarding Quantum AI and the risk of a barren plateau.</span></span></span></p> <p><span><span><span>“The Los Alamos work shows how some quantum neural networks are, in fact, immune to barren plateaus,” says </span><a href="https://www.sciencedaily.com/releases/2021/10/211018154236.htm">ScienceDaily</a><span> on the LANL release. “The Los Alamos team developed a novel graphical approach for analyzing the scaling within a quantum neural network and proving its trainability.”</span></span></span></p> <p><span><span><span>The key to LANL’s solution is the construction of the quantum neural network, with some being “immune” to barren plateaus explains Marco Cerezo, co-author of the LANL paper.</span></span></span></p> <p><span><span><span>“We proved the absence of barren plateaus for a special type of quantum neural network,” says Cerezo. 
“Our work provides trainability guarantees for this architecture, meaning that one can generically train its parameters.”</span></span></span></p> <p><span><span><span>The network proposed by LANL could be used by a variety of researchers attempting to solve the latest technological and scientific problems.</span></span></span></p> <p><span><span><span>“With this guarantee in hand, researchers will now be able to sift through quantum-computer data about quantum systems and use that information for studying material properties or discovering new materials, among other applications,” said Patrick Coles, the paper’s other coauthor.</span></span></span></p> <p><span><span><span>Additional breakthroughs in the field of Quantum AI were shared by IBM in July. The company announced it has developed a quantum kernel algorithm for a specific class of classification problems that is reproducibly faster than classical ML algorithms.</span></span></span></p> <p><span><span><span>Published in </span><a href="https://www.nature.com/articles/s41567-021-01287-z"><em>Nature Physics</em></a><span>, the paper, written by Yunchao Liu, a University of California, Berkeley graduate student and IBM research intern, alongside two IBM coauthors, describes the solution as “a rigorous and robust quantum speed-up.”</span></span></span></p> <p><span><span>IBM’s algorithm takes an existing, proven machine learning model and trains it with a quantum kernel method, allowing the problem to be solved efficiently in far less time than it would take with a classical method.</span></span></p> <p><span><span>“Its quantum advantage comes from the fact that we can construct a family of datasets for which only quantum computers can recognize the intrinsic labeling patterns, while for classical computers the dataset looks like random noise,” shares <a href="https://research.ibm.com/blog/quantum-kernels">IBM</a>.</span></span></p>
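<p><em>To show where a kernel like this plugs into a learning pipeline, here is a minimal, hedged sketch in Python (not IBM’s actual algorithm; the data and the kernel are stand-ins): a support vector machine can train directly on a precomputed kernel matrix. On quantum hardware, each entry would be estimated by a data-encoding circuit; a classical kernel is substituted here so the sketch runs end to end.</em></p> <pre><code>import numpy as np
from sklearn.svm import SVC

# Stand-in for a quantum kernel: on hardware, k(x, z) would be estimated
# as the overlap of two data-encoding quantum states. Here a classical
# RBF-style kernel keeps the example self-contained.
def kernel(a, b):
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return np.exp(-(d ** 2))

rng = np.random.default_rng(0)
X_train = rng.normal(size=(40, 2))   # toy training data
y_train = rng.integers(0, 2, 40)     # toy binary labels
X_test = rng.normal(size=(10, 2))

gram_train = kernel(X_train, X_train)  # kernel (Gram) matrix
clf = SVC(kernel="precomputed").fit(gram_train, y_train)

# Prediction uses the kernel between test points and training points.
pred = clf.predict(kernel(X_test, X_train))
print(pred)</code></pre>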
<p><span><span>Quantum AI is still in its infancy in the world of AI and ML. As researchers spend more time developing the technology, even more breakthroughs are bound to be discovered – potentially providing the answers to questions that haven’t yet been asked.</span></span></p> <p><span><span><span>Capitol Tech offers bachelor’s, master’s, and PhD programs in </span><a href="/fields-of-study/computer-science-artificial-intelligence-and-data-science">cyber analytics and data science</a><span>, including courses in machine learning, artificial intelligence, and natural language processing. Email </span><a href="mailto:admissions@captechu.edu">admissions@captechu.edu</a><span> for more information.</span></span></span></p> Categories: <a href="/taxonomy/term/38" hreflang="en">Computer Science, Artificial Intelligence and Data Science</a> Email Spoofing: What is it, how does it work, and how do we prevent it? /blog/email-spoofing-what-it-how-does-it-work-and-how-do-we-prevent-it Email Spoofing: What is it, how does it work, and how do we prevent it? <span><span lang about="/user/67246" typeof="schema:Person" property="schema:name" datatype>amschubert</span></span> <span><time datetime="2021-09-27T13:23:20-04:00" title="Monday, September 27, 2021 - 13:23">September 27, 2021</time><br><br> </span> <img loading="lazy" src="/sites/default/files/email%20spoofing.jpg" width="480" alt="email spoofing definition and prevention" typeof="foaf:Image"> <p>Email phishing is not a new concept. Since email became a widely used method of communication, unscrupulous individuals have been sending emails for fake lotteries, invented inheritances, and fake charitable schemes. Many of these scams were easy to identify by looking at the sender. However, as technology has advanced, so have phishing schemes, especially in the form of email spoofing.</p> <p><strong>What is it?</strong></p> <p>Email spoofing occurs when a sender “masks” the sender address on an email so that the message looks to the recipient like it has come from someone they know and trust.</p> <p>“When an email is sent, the From address doesn't show which server the email was actually sent from,” reports Hacker News in an <a href="https://thehackernews.com/2021/03/how-to-effectively-prevent-email.html" target="_blank">article on email spoofing</a>. “Instead, it shows the domain that was entered when the address was created so as not to arouse suspicion among recipients.”</p> <p>The result is a recipient receiving an email from AuntJane@domain.com, matching the address in their contacts, that isn’t actually from their aunt at all.</p> <p><strong>How does it work?</strong></p> <p>Email spoofing can occur because of how emails are handled by client applications and email servers, says Hacker News.</p> <p>“Outbound email servers have no way of knowing if the sender address is legitimate or spoofed,” says the article. “Therefore, email spoofing is possible because the email system used to represent email addresses provides no way for outbound servers to verify the legitimacy of the sender's address.”</p> <p>This means that malicious users can write scripts to reconfigure some email applications to display the address of one user when a message is sent by another. Hacker News reports that this level of scripting is not an advanced skill, meaning it can be used by many people even if they don’t have an expansive knowledge of coding.</p>
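<p><em>The sketch below, in Python using only the standard library (the addresses are made up for illustration), shows why this works: the From header is ordinary text chosen by the sending client, and nothing in the message itself ties it to the account that actually submitted the mail.</em></p> <pre><code>from email.message import EmailMessage

# The From header is just data supplied by the sending client; SMTP
# itself never checks it against the submitting account.
msg = EmailMessage()
msg["From"] = "AuntJane@domain.com"   # claimed sender -- entirely arbitrary
msg["To"] = "recipient@example.com"
msg["Subject"] = "Hello from Aunt Jane"
msg.set_content("This message did not really come from Aunt Jane.")

# Receiving-side checks such as SPF, DKIM, and DMARC (discussed below)
# are what catch the mismatch between the claimed sender and the server
# that actually delivered the message.
print(msg)</code></pre>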
<p><strong>How do we prevent it?</strong></p> <p>The threat of email spoofing, and other forms of phishing attacks, costs individuals and businesses vast amounts of money, and the number of email fraud attacks only continues to increase. Hacker News reported that phishing attacks increased 220% during the peak of the global pandemic in 2020 when compared to the prior year.</p> <p>So what can be done to prevent this form of attack? Hacker News suggests implementing Domain-Based Message Authentication, Reporting, and Conformance (DMARC), an email authentication protocol.</p> <p>“DMARC works with two standard authentication practices - SPF and DKIM - to authenticate outbound messages and provides a way to tell receiving servers how to respond to emails that fail authentication checks,” says Hacker News.</p> <p>This strategy effectively blocks unauthorized email from reaching the recipient, greatly reducing the number of spoofed messages that are delivered successfully.</p> <p>Cybersecurity experts need to stay on top of the latest trends in the industry in order to know how best to prevent these attacks. Capitol Tech offers bachelor’s, master’s, and doctorate degrees in <a href="/fields-of-study/cyber-and-information-security" target="_blank">cyber and information security</a> with coursework focused on the latest techniques for fighting cyberattacks.</p> <p>Many courses are available both on campus and online. To learn more about Capitol Tech’s degree programs, contact <a href="mailto:admissions@captechu.edu" target="_blank">admissions@captechu.edu</a>.</p> Categories: <a href="/taxonomy/term/39" hreflang="en">Cyber and Information Security</a>, <a href="/taxonomy/term/38" hreflang="en">Computer Science, Artificial Intelligence and Data Science</a>