
Machine learning gives us a dog’s-eye view, showing us how the brains of our best friends interpret the world

<p>Dogs’ minds are being read! Sort of.</p> <p>Researchers have used fMRI (functional magnetic resonance imaging) scans of dogs’ brains and a machine learning tool to reconstruct what the pooch is seeing. The results suggest that dogs are more interested in what is happening than in who or what is involved.</p> <p>The results of the experiment, conducted at Emory University in Georgia in the US, are <a href="https://dx.doi.org/10.3791/64442" target="_blank" rel="noreferrer noopener">published</a> in the <em>Journal of Visualized Experiments</em>.</p> <p>Two unrestrained dogs were shown three 30-minute videos. The fMRI neural data was recorded, and a machine-learning algorithm was employed to analyse the patterns in the scans.</p> <p>“We showed that we can monitor the activity in a dog’s brain while it is watching a video and, to at least a limited degree, reconstruct what it is looking at,” says Gregory Berns, professor of psychology at Emory. “The fact that we are able to do that is remarkable.”</p> <p>Using fMRI to study perception in this way has only recently been developed in humans, and in just a few other species, including some primates.</p> <p>“While our work is based on just two dogs it offers proof of concept that these methods work on canines,” says lead author Erin Phillips, from Scotland’s University of St. Andrews, who conducted the research as a specialist in Berns’s Canine Cognitive Neuroscience Lab. “I hope this paper helps pave the way for other researchers to apply these methods on dogs, as well as on other species, so we can get more data and bigger insights into how the minds of different animals work.”</p> <p>Machine learning, fittingly, is a technology that aims to mimic the neural networks in our own brains by recognising patterns in huge amounts of data.</p> <p>The technology “reads minds” by detecting patterns in the brain data that can be associated with what is playing in the video.</p> <p>Using a video recorder attached to a selfie stick held at dog eye level, the researchers filmed scenes relatable to a canine audience.</p> <p>Recorded activities included dogs being petted by people and receiving treats from them.</p> <p>Scenes with dogs showed them sniffing, playing, eating or walking. Other objects and animals in the scenes included cars, bikes, scooters, cats and deer, as well as people sitting, hugging, kissing, offering a toy to the camera and eating.</p> <p><iframe src="https://players.brightcove.net/5483960636001/default_default/index.html?videoId=6312584526112" width="960" height="540" allowfullscreen="allowfullscreen"></iframe></p> <p>Time stamps on the videos were used to classify the footage into objects (such as dog, car, human, cat) and actions (such as sniffing, eating, walking).</p> <p>Only two dogs exhibited the patience to sit through the feature-length film.
For comparison, two humans also underwent the same experiment. Both species, presumably, were coaxed with treats and belly pats.</p> <p>The machine-learning algorithm Ivis was applied to the data. Ivis was first trained on the human subjects, and the resulting model was 99% accurate in mapping the brain data onto both the object and the action classifiers.</p> <p>In the case of the dogs, the model did not work for the object-based classifiers. It was, however, between 75% and 88% accurate in decoding the action classifiers from the dog fMRI scans.</p> <p>“We humans are very object oriented,” says Berns. “There are 10 times as many nouns as there are verbs in the English language because we have a particular obsession with naming objects. Dogs appear to be less concerned with who or what they are seeing and more concerned with the action itself.”</p> <p>Dogs see only in shades of blue and yellow but have a slightly higher density of vision receptors designed for detecting motion than humans do.</p> <p>“It makes perfect sense that dogs’ brains are going to be highly attuned to actions first and foremost,” Berns adds. “Animals have to be very concerned with things happening in their environment to avoid being eaten or to monitor animals they might want to hunt. Action and movement are paramount.”</p> <p>Phillips believes understanding how animals perceive the world is important for her own research into how predator reintroduction in Mozambique may impact ecosystems.</p> <p>“Historically, there hasn’t been much overlap in computer science and ecology,” she says. “But machine learning is a growing field that is starting to find broader applications, including in ecology.”</p> <div id="contributors"> <p><em><a href="https://cosmosmagazine.com/technology/machine-learning-dog-see/" target="_blank" rel="noopener">This article</a> was originally published on <a href="https://cosmosmagazine.com" target="_blank" rel="noopener">Cosmos Magazine</a> and was written by <a href="https://cosmosmagazine.com/contributor/evrim-yazgin" target="_blank" rel="noopener">Evrim Yazgin</a>. Evrim Yazgin has a Bachelor of Science majoring in mathematical physics and a Master of Science in physics, both from the University of Melbourne.</em></p> <p><em>Image: Emory Canine Cognitive Neuroscience Lab</em></p> </div>
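<p>In outline, the decoding step described in the article above is a supervised classification problem: time-locked fMRI voxel patterns are the features, and the object and action labels derived from the video time stamps are the targets. The sketch below illustrates that general idea only; it is not the study’s actual Ivis pipeline, and the data shapes, label names and the simple logistic-regression classifier are assumptions for demonstration.</p>
<pre><code class="language-python">
# Minimal sketch of decoding video labels from fMRI voxel patterns.
# NOT the study's pipeline: the data are simulated, and the shapes,
# labels and classifier are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical data: one row per fMRI volume (time point),
# one column per voxel in a visual-cortex mask.
n_volumes, n_voxels = 1800, 500
X = rng.normal(size=(n_volumes, n_voxels))        # stand-in voxel signals
actions = ["sniffing", "eating", "walking", "playing"]
y = rng.choice(actions, size=n_volumes)           # stand-in action labels

# Hold out part of the time series to estimate decoding accuracy.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = LogisticRegression(max_iter=1000)           # simple linear decoder
clf.fit(X_train, y_train)

print("action-decoding accuracy:",
      accuracy_score(y_test, clf.predict(X_test)))
</code></pre>
<p>With real recordings, the held-out accuracy would indicate how reliably the action categories can be read back out of the brain data, which is the kind of figure the 75–88% result above reports.</p>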

Technology


Loneliness changes our brains

<p>Whether we are isolated due to COVID-19 lockdowns or for any other reason, feeling lonely is a common response, and it can affect our brains.</p> <p>Research published in <em><a rel="noopener" href="https://www.nature.com/articles/s41467-020-20039-w" target="_blank">Nature Communications</a></em> has found that the brains of people who report feeling lonely look and behave differently from the brains of people who don’t.</p> <p>It seems that persistent feelings of isolation can affect the size of different areas of the brain, as well as how those areas communicate with the rest of the brain.</p> <p>The researchers examined the magnetic resonance imaging (MRI) data, genetics and psychological self-assessments of approximately 40,000 middle-aged and older adults in the UK Biobank, a database available to scientists around the world.</p> <p>Comparing the data of participants who reported feeling lonely against those who didn’t, the scientists found several differences in the brains of the lonely.</p> <p>These differences are centred on a set of brain regions called the default network. These regions are involved in reminiscing, planning the future, imagination and thinking about others: we use this network to remember the past, envision the future and think about the hypothetical present.</p> <p>The default networks of lonely participants were found to be more strongly connected and to contain a larger volume of grey matter.</p> <p>This may be because lonely people are more likely to use their imagination, past memories or future hopes to overcome their social isolation.</p> <p>“In the absence of desired social experiences, lonely individuals may be biased towards internally-directed thoughts such as reminiscing or imagining social experiences,” said lead author Nathan Spreng from the Neuro (Montreal Neurological Institute-Hospital) at Canada’s McGill University.</p> <p><a rel="noopener" href="https://www.instagram.com/p/CI0fP8wA5W6/?utm_source=ig_embed&amp;utm_campaign=loading" target="_blank">A post shared by The Neuro (@theneuro_mni)</a></p> <p>Loneliness has been identified as a growing health problem, with previous studies showing that older people who experience loneliness have a higher risk of cognitive decline and dementia.</p> <p>“We are just beginning to understand the impact of loneliness on the brain,” said Danilo Bzdok, a researcher at the Neuro and the study’s senior author.</p> <p>“Expanding our knowledge in this area will help us better appreciate the urgency of reducing loneliness in today’s society.”</p>
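<p>At its core, the comparison described above is a group contrast: take a brain measure, such as grey-matter volume in default-network regions, split participants by whether they report loneliness, and test whether the groups differ. The sketch below is only an illustration of that idea, not the study’s analysis; the data are simulated, and the column names, effect size and plain Welch’s t-test are assumptions for demonstration.</p>
<pre><code class="language-python">
# Illustrative group comparison of a brain measure between participants
# who do and don't report loneliness. Simulated data; column names, the
# effect size and the simple t-test are assumptions, not the study's method.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(42)
n = 40_000  # roughly the number of UK Biobank participants analysed

df = pd.DataFrame({
    "reports_loneliness": rng.integers(0, 5, size=n) == 0,   # hypothetical flag
    "dn_grey_matter_vol": rng.normal(100.0, 10.0, size=n),   # arbitrary units
})
# Build in a small simulated group difference so the test has something to find.
df.loc[df["reports_loneliness"], "dn_grey_matter_vol"] += 1.5

lonely = df.loc[df["reports_loneliness"], "dn_grey_matter_vol"]
not_lonely = df.loc[~df["reports_loneliness"], "dn_grey_matter_vol"]

t, p = stats.ttest_ind(lonely, not_lonely, equal_var=False)  # Welch's t-test
print(f"mean difference: {lonely.mean() - not_lonely.mean():.2f}, p = {p:.2g}")
</code></pre>
<p>The actual study works with many imaging measures and controls for covariates, but the underlying logic is this kind of lonely-versus-not-lonely contrast.</p>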

Mind


Man killed after being sucked into MRI machine

<p>A young Indian man has been killed after being sucked into an MRI machine while visiting a sick relative in hospital.</p> <p>Rajesh Maru, 32, was pulled into the machine by its magnetic force after entering the room while carrying an oxygen cylinder.</p> <p><a href="https://www.ndtv.com/mumbai-news/man-carries-oxygen-cylinder-into-mri-room-gets-sucked-into-machine-dies-1805445" target="_blank"><strong><span style="text-decoration: underline;">NDTV</span></strong></a> reports Maru’s hand became trapped in the machine after the damaged cylinder burst, “triggering a massive oxygen leak”. He was rushed to the emergency room but pronounced dead just 10 minutes later.</p> <p>According to Mumbai Police, two members of hospital staff have been arrested in connection with the horrific incident, which occurred at Nair Hospital over the weekend.</p> <p>“We have arrested a doctor and another junior staff member under section 304 of the Indian penal code for causing death due to negligence,” police spokesman Deepak Deoraj told AFP.</p> <p>According to the victim’s family, they had been assured that the machine was switched off and that the room was safe to enter.</p> <p>“I asked the ward boy thrice about the machine but he ridiculed me, saying that he knows his job well and does not need to know from me about it,” Maru’s relative Priyanka Solanki said.</p> <p>His uncle, Jitendra Maru, added, “The ward boy who was supposed to prevent such incidents told my family members to go inside when the machine was turned on. We are shocked and devastated.”</p> <p>An investigation is currently underway and Maru’s family have been awarded 500,000 rupees ($9,700) in compensation.</p>

Mind


Woman diagnosed with MS turns her brain scans into art

<p>After being diagnosed with multiple sclerosis (MS) in 1991, former lawyer Elizabeth Jameson decided to do something a bit different with her brain scans – turn them into art.</p> <p>The diagnosis came as a huge shock to Jameson, who had first lost her ability to speak in the late ‘80s due to a lesion in her brain. For the outspoken civil rights lawyer, becoming mute simply wasn’t an option. Through intense speech therapy, she regained her voice, before learning she had MS.</p> <p>As a champion for children with chronic illness and disabilities, Jameson decided to give back to the community she suddenly found herself a part of. “I was a public interest lawyer, so I decided to become a public interest artist, whatever the hell that would mean,” she told <a href="http://www.fastcodesign.com/3061436/diagnosed-with-multiple-sclerosis-an-artist-turns-her-mris-into-art" target="_blank"><strong><span style="text-decoration: underline;">Fast Company Design</span></strong></a>.</p> <p>When she received her first MRI scan, Jameson didn’t want to look at the harsh, “ugly” black-and-white images. So she began silk painting and copper-etching her scans, creating colourful and lively works of art in an effort to “take the fear out of looking at MRIs”.</p> <p>Now quadriplegic, Jameson continues to make her incredible paintings with a little help from her assistant.</p> <p><strong>Related links:</strong></p> <p><a href="/entertainment/art/2016/05/artist-creates-fashion-designs-with-food/"><strong><em><span style="text-decoration: underline;">Artist creates incredible fashion designs using food</span></em></strong></a></p> <p><a href="/entertainment/art/2016/05/computer-creates-a-new-rembrandt-painting/"><em><strong><span style="text-decoration: underline;">Computer creates a new Rembrandt painting</span></strong></em></a></p> <p><a href="/entertainment/art/2016/05/artist-creates-paintings-using-bacteria/"><span style="text-decoration: underline;"><em><strong>Artist creates “paintings” using bacteria</strong></em></span></a></p>

Art
