House Democrats held a press conference Tuesday morning to announce articles of impeachment against President Trump for abuse of power and obstruction of Congress. Closing out the presser, House Intelligence Committee Chairman Adam Schiff argued that time is of the essence when it comes to holding Trump accountable.
"The argument 'why don't you just wait' amounts to this: Why don't you just let him cheat in one more election?" Schiff said.
"We stand here today because the president’s continuing abuse of his power has left us no choice. To do nothing wou… https://t.co/EjtkOPvg8Q
Schiff's words were met with praise, and many expressed gratitude that he's at the helm of the Democrats' efforts to hold Trump accountable for what they see as his continuous abuse of power.
Thankfully House Democrats have Adam Schiff. That's why Trump and the GOP want to silence him. #ImpeachAndRemoveTrump
Video footage shows Florida sheriffs giving a security briefing to an armed right-wing group that was heading to the U.S. Capitol on Jan. 6, the Daily Dot reports.
The videos show the Homeland Security Division of the Flagler County, Florida Sheriff’s Office meeting with the Flagler Liberty Coalition (FLC) and other pro-Trump protestors who were about to leave for Washington, D.C.
"The Flagler Liberty Coalition (FLC) recommended its members pack body armor, mace, and knives—which they said were for protection—and were working with Flagler County Commissioner Joe Mullins to bring crowds to D.C. that day. Mullins has faced criticism from his fellow local politicians for attending the protests that turned into the Capitol insurrection," the Dot's report stated. "Together, the group brought three buses of people to Washington on Jan. 6."
The Daily Dot's Eric Levai contends that the video is the best evidence yet of links between pro-Trump politicians, law enforcement, and right-wing groups in the lead-up to the Capitol riot. The videos have since been deleted from YouTube.
The video was recorded by independent journalist Tracey Eaton and shows a sheriff warning the group that antifa will be at the Capitol and plans to use "fire" as a weapon. On the group's website, the intent to go to the Capitol that day was expressed openly. One person in the group warned that the government might fry their cellphones if they stormed the Capitol.
The briefing took place one day before the Capitol riot.
Flagler County Sheriff’s deputy Mike Lutz warned people in the video to travel in groups because they might be attacked by antifa if they're alone. In another video, FLC member Mark Phillips tells protestors to bring helmets, body armor, mace, pepper spray, and knives and says that some members of the group will be in “fight mode.”
"I want everyone coming back from this trip with a win," Lutz says. "We need to take our country back and we need to show up for our president.
None of the FLC members have been arrested or charged with any crimes related to Jan. 6.
Sitting outside on a summer evening always sounds relaxing until flies and mosquitoes arrive – then the swatting begins. Despite their minuscule eyes and a brain roughly 1 million times smaller than yours, flies can evade almost every swat.
Flies can thank their fast, sophisticated eyesight and some neural quirks for their ability to escape swats with such speed and agility.
Our lab investigates insect flight and vision, with the goal of finding out how such tiny creatures can process visual information to perform challenging behaviors, such as escaping your swatter so quickly.
Faster vision
Flies have compound eyes. Rather than collecting light through a single lens that makes the whole image – the strategy of human eyes – flies form images built from multiple facets, lots of individual lenses that focus incoming light onto clusters of photoreceptors, the light-sensing cells in their eyes. Essentially, each facet produces an individual pixel of the fly’s vision.
A fly’s world is fairly low resolution, because small heads can house only a limited number of facets – usually hundreds to thousands – and there is no easy way to sharpen their blurry vision up to the millions of pixels people effectively see. But despite this coarse resolution, flies see and process fast movements very quickly.
Tiny hexagonal ‘facets’ take in light, and the photoreceptors beneath them process it in quick flashes.
We can infer how animals perceive fast movement from how quickly their photoreceptors can process light. Humans discern a maximum of about 60 discrete flashes of light per second. Any faster usually appears as steady light. The ability to see discrete flashes depends on the lighting conditions and which part of the retina you use.
Some LED lights, for example, emit discrete flashes of light quickly enough that they appear as steady light to humans – unless you turn your head. In your peripheral vision you may notice a flicker. That’s because your peripheral vision processes light more quickly, but at a lower resolution, like fly vision.
Remarkably, some flies can see as many as 250 flashes per second, around four times more flashes per second than people can perceive.
If you took one of these flies to the cineplex, the smooth movie you watched made up of 24 frames per second would, to the fly, appear as a series of static images, like a slide show. But this fast vision allows it to react quickly to prey, obstacles, competitors and your attempts at swatting.
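To put those figures side by side, here is a toy Python calculation using the approximate rates quoted above (the exact values vary by species, lighting and the part of the eye being used):

```python
# Rough temporal-resolution comparison, using the approximate figures above.
MOVIE_FPS = 24          # standard cinema frame rate
HUMAN_FLICKER_HZ = 60   # ~max discrete flashes per second a person resolves
FLY_FLICKER_HZ = 250    # upper estimate for some fast-seeing flies

# Each movie frame stays on screen for 1/24 of a second.
# How many visual "samples" does each observer get while a single frame is up?
for name, rate in [("human", HUMAN_FLICKER_HZ), ("fly", FLY_FLICKER_HZ)]:
    print(f"{name}: ~{rate / MOVIE_FPS:.1f} samples per movie frame")

# human: ~2.5 samples per frame  -> successive frames blur into smooth motion
# fly:   ~10.4 samples per frame -> each frame lingers, like a slide show
```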
Our research shows that flies in dim light lose some ability to see fast movements. This might sound like a good opportunity to swat them, but humans also lose their ability to see quick, sharp features in the dark. So you may be just as handicapped as your target.
When they do fly in the dark, flies and mosquitoes fly erratically, with twisty flight paths to escape swats. They can also rely on nonvisual cues, such as information from small hairs on their body that sense changes in the air currents when you move to strike.
Flight of a mosquito. Source: Intellectual Ventures.
Neural tricks
But why do flies see more slowly in the dark? You may have noticed your own vision becoming sluggish and blurry in the dark, and much less colorful. The process is similar for insects. Low light means fewer photons, and just like cameras and telescopes, eyes depend on photons to make images.
But unlike a nice camera, which allows you to switch to a larger lens and gather more photons in dark settings, animals can’t swap out the optics of their eyes. Instead, they rely on summation, a neural strategy that adds together the inputs of neighboring pixels, or increases the time they sample photons, to form an image.
Big pixels and longer exposures capture more photons, but at the cost of sharp images. Summation is equivalent to taking photographs with grainy film (higher ISO) or slow shutter speeds, which produce blurrier images but avoid underexposing your subjects. Flies, especially small ones, can’t see quickly in the dark because, in a sense, they are waiting for enough photons to arrive before they can be sure of what they are seeing.
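To make the camera analogy concrete, here is a minimal Python sketch (with made-up image data; the pooling sizes are arbitrary) of the two flavours of summation, pooling neighbouring pixels and adding up successive frames. Both pull a faint feature out of the noise, at the cost of spatial or temporal sharpness:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated dim scene: a faint bright patch buried in photon noise.
# Shape is (frames, height, width); values are photon counts per "pixel".
frames = rng.poisson(lam=1.0, size=(8, 64, 64)).astype(float)
frames[:, 30:34, 30:34] += 2.0  # the faint object

def spatial_summation(image, block=4):
    """Add neighbouring pixels into larger 'super-pixels': brighter but coarser."""
    h, w = image.shape
    trimmed = image[: h - h % block, : w - w % block]
    th, tw = trimmed.shape
    return trimmed.reshape(th // block, block, tw // block, block).sum(axis=(1, 3))

def temporal_summation(stack):
    """Add successive frames together: a longer 'exposure', blurrier in time."""
    return stack.sum(axis=0)

single = frames[0]                           # one sharp but noisy snapshot
pooled = spatial_summation(single)           # fewer, bigger, brighter pixels
long_exposure = temporal_summation(frames)   # one slow frame

# The faint patch stands out more clearly in the summed versions,
# simply because each output value collects more photons.
print(single.max(), pooled.max(), long_exposure.max())
```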
Flight maneuverability
In addition to rapidly perceiving looming threats, flies need to be able to fly away in a split second. This requires preparation for takeoff and quick flight maneuvers. After visually detecting a looming threat, fruit flies, for example, adjust their posture in one-fifth of a second before takeoff. Predatory flies, such as killer flies, coordinate their legs, wings and halteres – dumbbell-shaped remnants of wings used for sensing in-air rotations – to quickly catch their prey midflight.
Flight of a fly. Notice how they adjust their posture before takeoff. Source: The New York Times.
How best to swat a fly
To outmaneuver a fly, you must strike faster than it can detect your approaching hand. With practice, you may improve at this, but flies have honed their escapes over hundreds of millions of years. So instead of swatting, you are better off managing flies in other ways, such as installing fly traps and keeping your backyard clean.
Escape behavior of a fly in slow motion. Source: Florian Muijres et al, 2014 Science.
You can lure certain flies into a narrow-necked bottle filled with apple cider vinegar and beer. Placing a funnel in the bottle neck makes it easy for them to enter, but difficult to escape.
Apple cider vinegar and beer trap to control fruit flies in your kitchen or backyard.
As for mosquitoes, some commercial repellents may work, but removing stagnant water around the house – in some plants, pots or any open containers – will help eliminate their egg-laying sites and reduce the number of mosquitoes around from the start. Avoid insecticides, as they also harm useful insects such as bees and butterflies.
We often think of astronomy as a visual science with beautiful images of the universe. However, astronomers use a wide range of analysis tools beyond images to understand nature at a deeper level.
Data sonification is the process of converting data into sound. It has powerful applications in research, education and outreach, and also enables blind and visually impaired communities to understand plots, images and other data.
Its use as a tool in science is still in its early stages – but astronomy groups are leading the way.
In a paper published in Nature Astronomy, my colleagues and I discuss the current state of data sonification in astronomy and other fields, provide an overview of 100 sound-based projects and explore its future directions.
The cocktail party effect
Imagine this scene: you’re at a crowded party that’s quite noisy. You don’t know anyone and they’re all speaking a language you can’t understand – not good. Then you hear bits of a conversation in a far corner in your language. You focus on it and head over to introduce yourself.
While you may have never experienced such a party, the thought of hearing a recognizable voice or language in a noisy room is familiar. The ability of the human ear and brain to filter out undesired sounds and retrieve desired sounds is called the “cocktail party effect”.
Similarly, science is always pushing the boundaries of what can be detected, which often requires extracting very faint signals from noisy data. In astronomy we often push to find the faintest, farthest or most fleeting of signals. Data sonification helps us to push these boundaries further.
The video below provides examples of how sonification can help researchers discern faint signals in data. It features the sonification of nine bursts from a repeating fast radio burst called FRB121102.
Casey Law/YouTube.
Fast radio bursts are millisecond bursts of radio emission that can be detected halfway across the universe. We don’t yet know what causes them. Detecting them in other wavelengths is the key to understanding their nature.
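As a purely illustrative sketch of what "listening to the data" can mean in practice (this is not the pipeline behind the video above; the signal, pitch range and segment length are all made up), one simple approach maps each value of a noisy time series to a pitch and plays the result back, so a faint burst shows up as a brief jump in tone:

```python
import wave

import numpy as np

RATE = 44100  # audio samples per second

# Toy "observation": noise with one faint burst, standing in for the kind
# of radio time series discussed above.
rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, 500)
data[300:306] += 3.0  # the faint burst

# Map each data value onto a pitch between roughly 200 Hz and 1200 Hz,
# and give every data point a short 20-millisecond audio segment.
lo, hi = data.min(), data.max()
pitches = 200.0 + 1000.0 * (data - lo) / (hi - lo)

segment = np.arange(int(0.02 * RATE)) / RATE
audio = np.concatenate([np.sin(2 * np.pi * f * segment) for f in pitches])

# Write a 16-bit mono WAV file; the burst is audible as a quick rise in pitch.
samples = (0.5 * 32767 * audio).astype(np.int16)
with wave.open("burst_sonified.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)
    wav.setframerate(RATE)
    wav.writeframes(samples.tobytes())
```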
Too much of a good thing
When we explore the universe with telescopes, we find it’s full of cataclysmic explosions including the supernova deaths of stars, mergers of black holes and neutron stars that create gravitational waves, and fast radio bursts.
Here you can listen to the merger of two black holes.
LIGO/YouTube.
And the merger of two neutron stars.
LIGO/YouTube.
These events allow us to understand extreme physics at the highest-known energies and densities. They help us to measure the expansion rate of the universe and how much matter it contains, and to determine where and how the elements were created, among other things.
Upcoming facilities such as the Rubin Observatory and the Square Kilometre Array will detect tens of millions of these events each night. We employ computers and artificial intelligence to deal with these massive numbers of detections.
However, the majority of these events are faint bursts, and computers are only so good at finding them. A computer can pick out a faint burst if it’s given a template of the “desired” signal. But if signals depart from this expected behavior, they become lost.
And it’s often these very events that are the most interesting and yield the biggest insight into the nature of the universe. Using data sonification to verify these signals and identify outliers can be powerful.
More than meets the eye
Data sonification is useful for interpreting science because humans interpret audio information faster than visual information. Also, the ear can discern more pitch levels than the eye can discern levels of color (and over a wider range).
Another direction we’re exploring for data sonification is multi-dimensional data analysis – which involves understanding the relationships between many different features or properties in sound.
Plotting data in ten or more dimensions simultaneously is too complex, and interpreting it is too confusing. However, the same data can be comprehended much more easily through sonification.
As it turns out, the human ear can immediately tell the difference between the sound of a trumpet and a flute, even if they play the same note (frequency) at the same loudness and duration.
Why? Because each sound includes higher-order harmonics that help determine the sound quality, or timbre. The different strengths of the higher-order harmonics enable the listener to quickly identify the instrument.
Now imagine placing information – different properties of data – as different strengths of higher-order harmonics. Each object studied would have a unique tone, or belong to a class of tones, depending on its overall properties.
With a bit of training, a person could almost instantly hear and recognise all of the object’s properties, or its classification, from a single tone.
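A minimal sketch of that idea, assuming a small set of object properties already scaled to the range 0–1 (the property values and the mapping to harmonics are purely illustrative):

```python
import numpy as np

RATE = 44100         # audio samples per second
FUNDAMENTAL = 220.0  # every object plays the same base note (Hz)

def object_tone(properties, duration=1.0):
    """Encode properties (each in 0..1) as the strengths of higher harmonics.

    All tones share the same fundamental pitch; property k sets the amplitude
    of harmonic k + 2, so each object gets its own timbre.
    """
    t = np.arange(int(duration * RATE)) / RATE
    tone = np.sin(2 * np.pi * FUNDAMENTAL * t)  # the fundamental
    for k, strength in enumerate(properties, start=2):
        tone += strength * np.sin(2 * np.pi * FUNDAMENTAL * k * t)
    return tone / np.max(np.abs(tone))  # keep the amplitude in range

# Two hypothetical objects described by made-up, normalised properties
# (say brightness, size and distance, each rescaled to 0..1 beforehand).
tone_a = object_tone([0.9, 0.1, 0.4])
tone_b = object_tone([0.2, 0.8, 0.6])
# Played back, the two tones share a pitch but differ in timbre,
# much as a trumpet and a flute differ on the same note.
```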
Beyond research
Sonification also has great uses in education (Sonokids) and outreach (for example, SYSTEM Sounds and STRAUSS), and has widespread applications in areas including medicine, finance and more.
But perhaps its greatest power is to enable blind and visually impaired communities to understand images and plots to help with everyday life.
The ‘ticking’ noise that plays with the walk signal at traffic lights is one example of how sonification can assist blind and visually impaired people. Shutterstock
It can also enable meaningful scientific research, and do so quantitatively, as sonification research tools provide numerical values on command.
This capability can help promote STEM careers among blind and visually impaired people. And in doing so, we can tap into a massive pool of brilliant scientists and critical thinkers who may otherwise not have envisioned a path towards science.
What we need now is government and industry support in developing sonification tools further, to improve access and usability, and to help establish sonification standards.
With the growing number of tools available, and the growing need in research and the community, the future of data sonification sounds bright!