Eyes to the Universe | ALPHA Teaser Series #3 <span><span lang about="/user/69196" typeof="schema:Person" property="schema:name" datatype>emdecker</span></span> <span><time datetime="2022-02-25T12:40:47-05:00" title="Friday, February 25, 2022 - 12:40">February 25, 2022</time><br><br> </span> <img loading="lazy" src="/sites/default/files/ALPHA%20M81%20with%20Calibration%20data%20added%20with%20artifacts%20removed.jpg" width="640" alt="M81 image with Calibration data added with artifacts removed" typeof="foaf:Image"> <p><span><span><span>When you imagine an observer using a telescope, you might picture them at an eyepiece, taking in a magnified view of the target of interest. However, the unaided human eye can only detect objects brighter than about stellar magnitude 6.5, so a camera is needed to capture anything fainter. On the visual magnitude scale, the lower the number, the brighter the object, while higher numbers represent dimmer objects. For instance, the brightest star in the night sky is Sirius, in the constellation Canis Major, at magnitude -1.46, while the variable star SY Canis Majoris, in the same constellation, shines at only magnitude 9.5 – well beyond the eye’s reach. Thus, to detect fainter objects such as galaxies and asteroids, <a href="/news-events/observatory-coming-capitol-campus">the ALPHA observatory</a> will instead use a device called a “dedicated CMOS astronomy camera”, similar to the cameras in our cell phones but geared towards astronomy applications. This week, the ALPHA observatory’s ZWO 1600MM cooled monochrome camera arrived, and bench-testing will begin in two weeks. 
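The magnitude scale is logarithmic: every 5-magnitude difference corresponds to a factor of 100 in brightness, so the ratio between two objects is 10^(0.4 × Δm) (Pogson’s relation). A short Python sketch of that arithmetic (illustrative only, not part of the observatory software):

```python
def brightness_ratio(m_faint: float, m_bright: float) -> float:
    """Pogson's relation: a 5-magnitude gap equals a 100x difference in flux."""
    return 10 ** (0.4 * (m_faint - m_bright))

# Sirius (magnitude -1.46) versus a magnitude 9.5 star:
ratio = brightness_ratio(9.5, -1.46)
print(f"Sirius appears ~{ratio:,.0f} times brighter")
```

At roughly a 24,000-fold difference in apparent brightness, it is easy to see why faint targets demand a sensitive camera rather than the eye.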
In this teaser episode of the observatory series, let’s discuss ALPHA’s “eyes to the universe”.&nbsp; </span></span></span></p> <figure role="group" class="align-left"> <div alt="Parts received so far for the new ALPHA telescope and Observatory – ASI 1600MM Pro Lens and Hyper Star Lens" data-embed-button="media_browser" data-entity-embed-display="media_image" data-entity-embed-display-settings="{&quot;image_style&quot;:&quot;medium&quot;,&quot;image_link&quot;:&quot;&quot;}" data-entity-type="media" data-entity-uuid="129eb64d-366d-49ed-b261-c24491305ea6" title="ALPHA Parts - Astro Camera and Lens" data-langcode="en" class="embedded-entity"> <img loading="lazy" src="/sites/default/files/styles/medium/public/ALPHA%20Parts%20Received%20-%20ASI%201600MM%20Pro%20Lens%20Camera%20and%20HyperStar%20Lens.jpg?itok=oBU3PCRj" alt="Parts received so far for the new ALPHA telescope and Observatory – ASI 1600MM Pro Lens and Hyper Star Lens" title="ALPHA Parts - Astro Camera and Lens" typeof="foaf:Image"> </div> <figcaption>Above: Parts received so far for the new ALPHA telescope and Observatory – ZWO 1600MM Pro Cooled CMOS Camera and HyperStar Lens.</figcaption> </figure> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <figure role="group" class="align-left"> <div alt="Professor Mabson demonstrates how camera and lens will be attached to the ALPHA telescope." data-embed-button="media_browser" data-entity-embed-display="media_image" data-entity-embed-display-settings="{&quot;image_style&quot;:&quot;large&quot;,&quot;image_link&quot;:&quot;&quot;}" data-entity-type="media" data-entity-uuid="547f9561-20d6-45a2-8155-591197cffa6b" title="Professor Mabson with Telescope Astro ALPHA" data-langcode="en" class="embedded-entity"> <img loading="lazy" src="/sites/default/files/styles/large/public/Professor%20Mabson%20demonstrates%20ALPHA%20lens%20attachments.jpg?itok=DJDZyB1M" alt="Professor Mabson demonstrates how camera and lens will be attached to the ALPHA telescope." 
title="Professor Mabson with Telescope Astro ALPHA" typeof="foaf:Image"> </div> <figcaption>Capitol Tech Astronautical Engineering Professor Dr. Marcel Mabson demonstrates how the camera and lens will be attached to the ALPHA telescope.</figcaption> </figure> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p><span><span><span>The ZWO 1600MM camera will be ALPHA’s “eyes to the universe” – it will enable operators to view objects as if they were looking through an eyepiece. One of the benefits of a CMOS camera is its ability to detect fainter objects. To assist with this, the camera has a built-in cooler that can chill the CMOS chip to about 25 degrees below the ambient temperature. The ability to cool the camera is essential – the colder the sensor, the less thermal noise appears in the images. Because these targets are faint, we need to capture multiple images and perform “stacking”. This procedure increases the signal-to-noise ratio and aids in pulling out data on the object. The images below demonstrate this. NGC 7635, better known as the “Bubble Nebula”, is located in the constellation Cassiopeia. The first image is a 120-second exposure, the second a 10-minute exposure, and the third a 2.5-hour exposure. 
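The benefit of collecting more data can be sketched numerically: the object’s signal is the same in every frame while the random noise is not, so averaging N frames improves the signal-to-noise ratio by roughly the square root of N. A small NumPy simulation with synthetic frames (illustrative numbers, not real camera data):

```python
import numpy as np

rng = np.random.default_rng(42)
SIGNAL = 5.0   # true brightness of a faint target (arbitrary units)
NOISE = 20.0   # per-frame noise, much larger than the signal

def stacked_snr(n_frames: int) -> float:
    """Average n_frames noisy frames and estimate the resulting SNR."""
    frames = SIGNAL + rng.normal(0.0, NOISE, size=(n_frames, 10_000))
    stack = frames.mean(axis=0)        # per-pixel average across the stack
    return stack.mean() / stack.std()  # recovered signal / remaining noise

for n in (1, 16, 256):
    print(f"{n:4d} frames -> SNR ~ {stacked_snr(n):.2f}")
```

With these numbers a single frame buries the signal in noise (SNR ≈ 0.25), while 256 averaged frames lift it to SNR ≈ 4 – the same reason a long stack shows far more detail than a single short exposure.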
As we can see, the more data that is collected, the more detail can be extracted from the object and the less noise is present in the images.&nbsp; </span></span></span></p> <figure role="group" class="align-left"> <div alt="Example of Stacking Photo Method using CMOS Camera" data-embed-button="media_browser" data-entity-embed-display="media_image" data-entity-embed-display-settings="{&quot;image_style&quot;:&quot;large&quot;,&quot;image_link&quot;:&quot;&quot;}" data-entity-type="media" data-entity-uuid="81b562ad-aecf-42f1-90bb-1bdf58b0d6cc" title="ALPHA Example of Stacking Photos using CMOS Camera Astro" data-langcode="en" class="embedded-entity"> <img loading="lazy" src="/sites/default/files/styles/large/public/Example%20of%20Stacking%20Photos%20using%20CMOS%20Camera%20ALPHA.png?itok=jsiUplLd" alt="Example of Stacking Photo Method using CMOS Camera" title="ALPHA Example of Stacking Photos using CMOS Camera Astro" typeof="foaf:Image"> </div> <figcaption>(Three images demonstrating the “stacking” method of a cooled CMOS camera to detect faint celestial objects. Left: 120-second exposure, Center: 10-minute exposure, Right: 2.5-hour exposure)</figcaption> </figure> <p>&nbsp;</p> <p><span><span><span>Calibration Data:</span></span></span></p> <p><span><span><span>While we are able to cool the camera to lower its thermal noise, we cannot eliminate all thermal noise from the device. To remove the unwanted noise from the images, we create calibration data called “dark” and “bias” frames. A dark frame records the thermal noise remaining at a given exposure length and temperature; this dark frame is then subtracted from the image. In addition, we create a calibration frame called a “bias” frame: as the camera reads data from each pixel, it introduces “read” noise, which creates a pattern of various “shades” that can appear in the images. The final piece of calibration data we create is a “flat” frame. 
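Once the calibration frames exist, applying them is simple per-pixel arithmetic: subtract the dark from the light frame, then divide by a normalized flat. A minimal NumPy sketch with tiny synthetic arrays (the values and frame names are made up for illustration; this is the standard reduction formula, not ALPHA’s actual pipeline):

```python
import numpy as np

def calibrate(light, dark, flat, flat_dark):
    """Standard CCD/CMOS reduction: (light - dark) / normalized flat."""
    flat_corrected = flat - flat_dark  # remove dark/bias signal from the flat
    flat_norm = flat_corrected / flat_corrected.mean()
    return (light - dark) / flat_norm

# Tiny synthetic example: a 2x2 sensor where a dust mote halves one pixel.
true_sky = np.full((2, 2), 100.0)
dark = np.full((2, 2), 10.0)                      # thermal + bias pattern
vignette = np.array([[1.0, 1.0], [1.0, 0.5]])     # dust shadow on one pixel
light = true_sky * vignette + dark                # what the camera records
flat = 200.0 * vignette + 10.0                    # exposure of a uniform panel

calibrated = calibrate(light, dark, flat, np.full((2, 2), 10.0))
# calibrated comes out uniform: the dust shadow and dark signal are gone
```

Because the flat frame records the same dust shadow that contaminates the light frame, dividing by it restores a uniform sky (up to a constant scale factor).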
Imperfections can also appear in the data that come not from the camera but from the telescope itself – dust, hair, and other interfering artifacts can collect on the telescope’s corrector plate. At the beginning or end of an imaging session, we create a flat frame that captures the state of the main optics, so these artifacts can be removed from the images. Below is an example of M81 with and without calibration. Notice that the uncalibrated image contains two small rings – dust that has collected on the optics – along with bias noise (brighter areas around the center and upper right parts of the image), while the calibrated image shows all artifacts removed. </span></span></span></p> <figure role="group" class="align-left"> <div alt="M81 Images with and without calibration data added" data-embed-button="media_browser" data-entity-embed-display="media_image" data-entity-embed-display-settings="{&quot;image_style&quot;:&quot;large&quot;,&quot;image_link&quot;:&quot;&quot;}" data-entity-type="media" data-entity-uuid="2c1080b8-efc7-4b2d-bb60-f5eb3f4e71e7" title="ALPHA Astro Images M81 with / without Calibration Data" data-langcode="en" class="embedded-entity"> <img loading="lazy" src="/sites/default/files/styles/large/public/ALPHA%20M81%20with%20and%20without%20Calibration%20data.jpg?itok=kwo80N7n" alt="M81 Images with and without calibration data added" title="ALPHA Astro Images M81 with / without Calibration Data" typeof="foaf:Image"> </div> <figcaption>(Left: M81 without calibration data, artifacts intact; Right: M81 with calibration data added, artifacts removed)</figcaption> </figure> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p><span><span><span>Why Monochrome?: </span></span></span></p> <p><span><span><span>All images generated by ALPHA will be in monochrome (black and white). 
There are various camera configurations that allow a user to generate a color image – however, they come with a drawback, and ALPHA’s goal is to detect faint asteroid targets. For a single sensor to produce color, its pixels must be covered by a Bayer filter matrix: some pixels receive a green filter, while others receive red or blue. This filtering, in turn, lowers the sensitivity of the camera. To allow maximum sensitivity, ALPHA will use a monochrome camera, which allows each pixel to use 100% of its area to capture light, maximizing image quality.</span></span></span></p> <figure role="group" class="align-left"> <div alt="M33 Triangulum Galaxy" data-embed-button="media_browser" data-entity-embed-display="media_image" data-entity-embed-display-settings="{&quot;image_style&quot;:&quot;large&quot;,&quot;image_link&quot;:&quot;&quot;}" data-entity-type="media" data-entity-uuid="eeaa73fa-5b1c-4930-bc8c-42e0e544cae5" title="ALPHA Astro M33 Triangulum Galaxy" data-langcode="en" class="embedded-entity"> <img loading="lazy" src="/sites/default/files/styles/large/public/ALPHA%20M33%20Triangulum%20Galaxy.jpg?itok=s8ybqNh8" alt="M33 Triangulum Galaxy" title="ALPHA Astro M33 Triangulum Galaxy" typeof="foaf:Image"> </div> <figcaption>Above: Image of M33 Triangulum Galaxy</figcaption> </figure> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p>&nbsp;</p> <p><span><span><span>Asteroid detection: </span></span></span></p> <p><span><span><span>ALPHA’s primary goal will be to detect and perform follow-up studies of asteroids and comets. To do this, ALPHA will be configured in what is known as the “F2 configuration”. The focal ratio (F number) is the telescope’s focal length divided by its aperture, and it determines how quickly the optical system gathers light. In its native F10 configuration, ALPHA’s telescope will have over 2800mm of focal length available. 
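Those numbers pin down the rest of the optics: since the focal ratio is focal length divided by aperture, a 2800mm F10 system implies a 280mm aperture, and the exposure time needed for an extended object scales roughly with the square of the focal ratio. A quick Python sketch of that arithmetic (the squared scaling is the standard rule of thumb for extended sources, not an ALPHA-specific figure):

```python
def aperture_mm(focal_length_mm: float, f_ratio: float) -> float:
    """Focal ratio = focal length / aperture, so aperture = focal length / F."""
    return focal_length_mm / f_ratio

def exposure_scale(f_new: float, f_old: float) -> float:
    """Relative exposure time for extended objects: (f_new / f_old) squared."""
    return (f_new / f_old) ** 2

print(aperture_mm(2800, 10))   # 280.0 mm of aperture
print(exposure_scale(2, 10))   # 0.04: F2 needs a small fraction of the F10 time
```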
While this may seem advantageous, a major drawback of long focal length telescopes is the time required to collect enough light from a given object. Many of ALPHA’s targets are fast moving – if an exposure runs long enough, the target of interest forms a “streak” or line across the image. Another drawback of this configuration is that the field of view (FOV) is small, so the telescope must keep moving with the object or the object will drift out of the telescope’s view. To resolve this, ALPHA will use the HyperStar system, which replaces the telescope’s secondary mirror and transforms the telescope from a native F10, 2800mm system into an F2, 560mm system. This provides ALPHA with a FOV of ~3 degrees, meaning that what an F10 system could accomplish in 1 hour, the F2 configuration can accomplish in ~25 minutes. ALPHA will remain in the F2 configuration for the majority of its time during bright lunar phases, and will switch to an F6.3 configuration for student projects and community outreach events. </span></span></span></p> <p><span><span><span>Visit our <a href="/degrees-and-programs/bachelors-degrees/astronautical-engineering-bs">website</a> to learn more about <a href="/degrees-and-programs/bachelors-degrees/astronautical-engineering-bs">astronautical engineering</a> and Capitol Tech’s other <a href="/fields-of-study/aviation-and-unmanned-systems">aviation programs</a>. Many of our courses are available both on-campus and online. 
For more information, contact <a href="mailto:admissions@captechu.edu">the Capitol Tech Admissions Team</a>.</span></span></span></p> Categories: <a href="/taxonomy/term/36" hreflang="en">Uncrewed Systems</a>, <a href="/blog/category/astronautical-engineering" hreflang="en">Astronautical Engineering</a>, <a href="/taxonomy/term/42" hreflang="en">Engineering Technologies</a> <section id="section-34731" class="section background-white"> <div class="super-contained"> </div> </section> Fri, 25 Feb 2022 17:40:47 +0000 emdecker 8406 at Robots Played Pivotal Role at the 2020 Tokyo Olympics and Paralympics&nbsp; <span><span lang about="/user/67246" typeof="schema:Person" property="schema:name" datatype>amschubert</span></span> <span><time datetime="2021-10-07T11:47:46-04:00" title="Thursday, October 7, 2021 - 11:47">October 7, 2021</time><br><br> </span> <img loading="lazy" src="/sites/default/files/robotsolympics.jpg" width="354" alt="robots in 2020 tokyo olympics" typeof="foaf:Image"> <p>There were many moments at the 2020 Tokyo Olympics that made the games unique. Though scheduled for 2020, they were delayed until the summer of 2021 due to the global pandemic, and then had no audience beyond coaches and teammates in attendance. 
Another element that made these Olympics unique was the use of robots.</p> <p>A number of robots and autonomous solutions debuted during the Olympics, including robot ambassadors, field support robots, virtual mobility robots, and autonomous vehicles.</p> <p><strong>Robot Ambassadors and Mascots</strong></p> <p>Robot versions of the two Olympic mascots, Miraitowa and Someity, were developed by Toyota Motor Corporation.</p> <p>“Mascot-type robots will welcome athletes and guests at Games venues and other Games-related locations with human-like movements, such as shaking hands and waving, and with a variety of facial expressions,” shares the <a href="https://olympics.com/ioc/news/new-robots-unveiled-for-tokyo-2020-games" target="_blank">International Olympic Committee (IOC)</a>.</p> <p><strong>Field Support Robots</strong></p> <p>Playing a large part in streamlining the games, field support robots act as mechanical retrievers, picking up equipment used in track and field events. The robots can determine the path of thrown items – javelin, hammer, and others – and guide staff along paths that avoid obstacles.</p> <p>“This will help reduce both the amount of time needed to retrieve items and the amount of human support required at events,” says the IOC. Limiting the number of volunteers needed for events is especially beneficial due to the restrictions of the pandemic.</p> <p><strong>Virtual Mobility Robots</strong></p> <p>Developed by Toyota Research Institute (TRI) in the United States, T-TR1 robots are essentially rolling computer screens. 
The robots project images of individuals at various Olympic events to those at a different location.</p> <p>“With T-TR1, Toyota will give people that are physically unable to attend the events such as the Games a chance to virtually attend, with an on-screen presence capable of conversation between the two locations,” says a <a href="https://www.tri.global/news/toyota-introduces-tris-t-tr1-a-virtual-mobility-2019-7-22/" target="_blank">blog post by TRI</a>.</p> <p><strong>Autonomous Vehicles</strong></p> <p>The Toyota e-Palette autonomous vehicle was used to transport athletes and coaches from the Olympic villages to the various Olympic venues. The vehicle can run for 90 miles before needing to be recharged.</p> <p>“A management system sends out vehicles to the locations in a pre-planned frequency, but automatically adjusts the schedule when passenger number build up,” writes Graeme Massie for <a href="https://www.independent.co.uk/sport/olympics/olympic-games-tokyo-robots-vehicles-b1884259.html" target="_blank">The Independent</a>.</p> <p>Toyota’s goal for all of the new robots was to show how the innovative creations can contribute to everyday life. The company even has plans to reuse certain robots, including the autonomous vehicles, as delivery lockers or mobile shops.</p> <p>Want to learn about unmanned systems, including robotics? Capitol Tech offers bachelor’s, master’s, and doctorate degrees in <a href="/fields-of-study/aviation-and-unmanned-systems" target="_blank">unmanned systems</a>. Many courses are available both on campus and online. 
To learn more about Capitol Tech’s degree programs, contact&nbsp;<a href="mailto:admissions@captechu.edu" target="_blank">admissions@captechu.edu</a>.&nbsp;&nbsp;</p> <p><a href="https://wonderfulengineering.com/these-are-the-robots-showcasing-japanese-technology-at-the-tokyo-2020-olympics/"><em>Photo from Wonderful Engineering</em></a></p> Categories: <a href="/taxonomy/term/36" hreflang="en">Uncrewed Systems</a> <section id="section-32026" class="section background-white"> <div class="super-contained"> </div> </section> Thu, 07 Oct 2021 15:47:46 +0000 amschubert 7891 at Aerial Drone Spotter’s Guide <span><span lang typeof="schema:Person" property="schema:name" datatype>Anonymous (not verified)</span></span> <span><time datetime="2019-04-23T13:04:25-04:00" title="Tuesday, April 23, 2019 - 13:04">April 23, 2019</time><br><br> </span> <img loading="lazy" src="/sites/default/files/Lifesaver_social.jpg" width="640" alt="aerial lifeguard drone image" typeof="foaf:Image"> <p>Unmanned and autonomous systems, also called drones, are designed to accomplish distinctive jobs. Some are built with sensors or thermal imaging cameras, while others can carry packages. 
One thing is consistent: average salaries in the growing unmanned aircraft industry range from $60,000 to $145,000.</p> <div alt="types of aerial drones used in saving lives infographic" data-embed-button="media_browser" data-entity-embed-display="media_image" data-entity-embed-display-settings="{&quot;image_style&quot;:&quot;&quot;,&quot;image_link&quot;:&quot;file&quot;,&quot;image_loading&quot;:{&quot;attribute&quot;:&quot;lazy&quot;}}" data-entity-type="media" data-entity-uuid="70e38631-1672-43f1-9679-4a79e36e1237" title="types of aerial drones used in saving lives infographic" class="align-center embedded-entity" data-langcode="en"> <a href="/sites/default/files/DroneSpotter_Infographic_FNL.jpg"><img loading="lazy" src="/sites/default/files/DroneSpotter_Infographic_FNL.jpg" alt="types of aerial drones used in saving lives infographic" title="types of aerial drones used in saving lives infographic" typeof="foaf:Image"> </a> </div> Categories: <a href="/taxonomy/term/36" hreflang="en">Uncrewed Systems</a> <section id="section-18131" class="section background-white"> <div class="super-contained"> </div> </section> Tue, 23 Apr 2019 17:04:25 +0000 Anonymous 4351 at