New parents can often be brought to their wits’ end trying to figure out whether their newborn is sick, tired, hungry, in need of a hug, or perhaps something more serious.
Thankfully, a team of US researchers has developed a new artificial intelligence method that can tell one baby's cries apart from another's, and can distinguish an individual baby's critical cries from non-critical ones, e.g. whether the child is in distress or just hungry.
Using a proprietary cry-language algorithm based on automatic speech recognition, the researchers detect hidden patterns in an infant's cry via compressed sensing, a technique that reconstructs the original raw cry signal from sparse data input.
This is especially useful for analyzing babies' cries in noisy environments, which is where they are most likely to be recorded: nursing wards, maternity hospitals, or a chaotic home as desperate parents try to remember where they left the baby powder.
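The article doesn't detail the researchers' reconstruction method, but the core idea of compressed sensing (recovering a sparse signal from far fewer measurements than its length) can be sketched with a basic iterative soft-thresholding (ISTA) solver. All dimensions and parameters below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 100, 40, 4                      # signal length, measurements, nonzeros
x_true = np.zeros(n)                      # sparse stand-in for the cry signal
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)  # random sensing matrix
y = A @ x_true                            # compressed (sparse) measurements

# ISTA: alternate a gradient step on ||Ax - y||^2 with soft-thresholding,
# which drives most coefficients toward exactly zero.
x = np.zeros(n)
step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1 / (spectral norm)^2
lam = 0.01                                # sparsity weight (arbitrary)
for _ in range(2000):
    x = x - step * A.T @ (A @ x - y)
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {rel_err:.4f}")
```

With only 40 measurements of a 100-sample signal, the 4-sparse vector is recovered almost exactly, which is what makes the approach attractive for noisy, bandwidth-limited recordings.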
The algorithm can classify and log various cry features which then provide insight into why babies are crying and how urgent their situation is, crucial information for parents and healthcare workers alike.
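The actual cry features and classifier are not described in the article, so the classify-and-log idea can only be illustrated with a toy nearest-centroid classifier over two invented features; the feature names and centroid values below are assumptions for illustration only:

```python
# Hypothetical cry features and centroid values, invented for illustration;
# the researchers' real feature set is not public.
CENTROIDS = {
    "hungry":   (350.0, 1.0),   # (typical pitch in Hz, cry bursts per second)
    "distress": (550.0, 3.0),
}

def classify_cry(pitch_hz, burst_rate):
    """Assign a cry to the nearest centroid in (scaled) feature space."""
    def dist(label):
        p, b = CENTROIDS[label]
        # scale pitch down so both features contribute comparably
        return ((pitch_hz - p) / 100.0) ** 2 + (burst_rate - b) ** 2
    return min(CENTROIDS, key=dist)

print(classify_cry(360, 1.2))   # -> hungry
print(classify_cry(540, 2.8))   # -> distress
```

A real system would log these labels with timestamps so caregivers and clinicians can see how often, and how urgently, a baby is crying.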
“Like a special language, there are lots of health-related information in various cry sounds. The differences between sound signals actually carry the information,” explained Lichuan Liu of the Digital Signal Processing Laboratory which conducted the research.
The ultimate goals are healthier babies and less pressure on parents and caregivers.
The research is published in the May issue of the IEEE/CAA Journal of Automatica Sinica (JAS).
Helsinki-based Silo.AI, which claims to be the largest private AI services provider in the Nordics, has raised €10 million in funding to build out its platform and expand its team.
The cash, which Silo.AI says will be used to turn it into “the European flagship AI company”, comes from the Business Finland Growth Engine program and a number of unnamed angel investors.
Founded in 2017, Silo.AI currently boasts a team of 40 people building end-to-end customisable AI solutions and working as consultants within client teams.
“We are excited about the possibility to take our services operation and platform to the next level. We see the AI market as a service market, in which the largest share of value derives from tailoring solutions to client requirements. The funding will allow us to accelerate our expansion to the European market,” said Peter Sarlin, CEO and co-founder of Silo.AI.
Notably, the startup was instrumental in the formation of the Nordic AI Alliance, which aims to contribute to accelerating adoption of AI technology in businesses and organisations throughout the Nordic countries.
Nine days before the World Health Organization alerted the world to the threat posed by COVID-19, an artificial intelligence-powered startup led by the University of Toronto’s Kamran Khan had already spotted the first signs of an unusual outbreak.
In an interview with CNBC, Khan explained how his company, BlueDot, was able to scour big data and spot the emergence of the novel coronavirus before anyone else.
He said BlueDot uses machine learning and natural language processing to comb through masses of data and flag potential threats, which are then reviewed by doctors and computer programmers who create threat reports.
“We don’t use artificial intelligence to replace human intelligence, we basically use it to find the needles in the haystack and present them to our team,” said Khan, an associate professor at the Institute of Health Policy, Management and Evaluation at the Dalla Lana School of Public Health and an infectious disease physician at St. Michael’s Hospital.
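BlueDot's pipeline is proprietary, but the "needles in the haystack" idea, automatically surfacing a small set of candidate reports for human review, can be sketched with a naive keyword filter; the reports and signal terms below are invented:

```python
import re

# Invented mini-corpus standing in for the news feeds such a system scans.
reports = [
    "Officials report a cluster of pneumonia cases of unknown cause in Wuhan.",
    "Local football team wins the regional championship.",
    "Hospital sees unusual spike in respiratory illness admissions this week.",
]

# Naive NLP stand-in: flag reports containing outbreak-related terms.
SIGNAL_TERMS = {"pneumonia", "outbreak", "unknown", "respiratory", "cluster", "spike"}

def flag_for_review(texts):
    """Return (report, matched_terms) pairs worth a human analyst's time."""
    flagged = []
    for text in texts:
        tokens = set(re.findall(r"[a-z]+", text.lower()))
        hits = tokens & SIGNAL_TERMS
        if len(hits) >= 2:              # require multiple signals to cut noise
            flagged.append((text, sorted(hits)))
    return flagged

for text, hits in flag_for_review(reports):
    print(hits, "->", text)
```

The machine narrows millions of items down to a shortlist; the doctors and programmers Khan describes then do the actual epidemiological judgment.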
He said his experience treating patients during the SARS outbreak in 2003 inspired him to start BlueDot.
“What I learned during SARS is, let’s not get caught flatfooted, let’s anticipate rather than react.”
Of all the ways artificial intelligence is being adopted around the world, this is definitely one of the weirdest. Domino's is launching a pizza checker that will automatically scan and identify bung pizzas before they leave the store.
The company has called the technology the DOM Pizza Checker: essentially a camera that sits above the store's cutting board and scopes out pizzas as they're made. It compares what it sees to a big database of pizzas, and if one's not up to scratch, it tells the staff to make it again.
It takes into consideration the distribution of toppings, cheese, and just the general look of the pizza to make sure it’s up to a high standard. At some point later this year, the store will even send you a goddamn picture of your pizza, letting you know whether it’s passed the test or is being remade.
The rollout of this new system is aimed at addressing the biggest complaint Domino's receives, which is, “my pizza doesn’t look like it should”.
“For anyone who has ever been disappointed with their pizza for any reason – maybe there was hardly any pepperoni on it, or not enough cheese – rest assured, we have heard you! And we’re determined to make it right,” Domino’s ANZ CEO, Nick Knight, said in a release.
The DOM has been in development with Dragontail Systems for the past two years and is now being used in all Domino's stores across Australia and New Zealand.
It’s a fairly full-on move as far as pizza making goes, but I guess getting a lopsided one is pretty shitty. So, uh, all hail our new AI pizza inspector, DOM.
Because of the ease with which coronavirus spreads from person to person, testing has been identified as one of the best ways to control the disease. But testing has cost and resource limitations. Applying AI and robot technology could help overcome these problems while reducing medical practitioners' exposure to the virus.
The technology has proved successful in medical trials, including identifying cancer in breast scans.
A research paper from Google Health, published in Nature magazine, has reported that machine learning, based on Google’s TensorFlow algorithm, can be used to reduce false positives in breast cancer scans. A false positive is when a mammogram scan is incorrectly identified as cancerous, and a false negative is when it is wrongly diagnosed as not being cancerous.
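Given those definitions, the false-positive and false-negative rates such a study reports can be computed directly from predictions and ground-truth labels; the toy labels below are invented for illustration:

```python
def error_rates(y_true, y_pred):
    """False-positive and false-negative rates (label 1 = cancerous)."""
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    negatives = sum(1 for t in y_true if t == 0)
    positives = sum(1 for t in y_true if t == 1)
    return fp / negatives, fn / positives

# Toy example: 10 scans, 4 truly cancerous. One healthy scan is flagged
# (false positive) and one cancer is missed (false negative).
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]
fpr, fnr = error_rates(y_true, y_pred)
print(fpr, fnr)   # 1/6 false-positive rate, 1/4 false-negative rate
```

Reducing the first number is exactly what the Google Health result targets: fewer healthy women called back for unnecessary follow-up.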
Corporate bosses don’t talk about it in public, but among themselves – psssst – they whisper excitedly about implementing a transformative “AI agenda” across our economy.
AI stands for artificial intelligence, the rapidly advancing digital technology of creating thinking robots that program themselves, act on their own, and even reproduce themselves. These automatons are coming soon to a workplace near you!
Not wanting to stir a preemptive rebellion by human workers, corporate chieftains avoid terms like automation of jobs, instead substituting euphemisms like “digital transformation” of work. Privately, however, top executives see AI as their path to windfall profits and personal enrichment by replacing whole swaths of their workforce with an automated army of cheap machines that don’t demand raises, take time off, or form unions.
As one prominent tech exec confided to the New York Times, AI “will eliminate 40 percent of the world’s jobs within 15 years.” Some CEOs are so giddy about AI’s profiteering potential that they openly admit their intentions. Take Foxconn, the Taiwanese electronics giant hailed as a job-creating savior last year by Donald Trump. It was given $3 billion in public subsidies to open a huge manufacturing plant in Wisconsin, but it’s now reneging and declaring that it intends to replace 80 percent of its global workforce with robots within 10 years. Corporate apologists say displaced humans can be “reskilled” to do something else. But what? Where? When? No response.
Executives try to skate by the human toll by saying that the machine takeover is the inevitable march of technological progress. Hogwash! There’s nothing “natural” about the AI agenda – it’s a choice being made by an elite group of corporate and political powers trying to impose their selfish interests over us.
Neurons.AI (the new home of into.AI) is an index of information resources on Artificial Intelligence and Machine Learning related topics. Check out the Directory, News or Channels menu options above, or scroll down to learn more about what we do.
Today every person on Earth has been affected by the COVID-19 outbreak. Airplanes are grounded, borders are closed, businesses are shut, citizens are forced into quarantine, and governments are making snap emergency decisions that undermine the principles of democracy.
All this, if not stopped soon, could lead to chaos and unrest.
Currently, HTTP://WWW.ST-lF.COM is proud to represent the worldwide scientific community by fundraising for COVID-19 vaccine development.
It takes guts to make a science-fiction/horror film that visually recalls Alien and name its repository of artificial intelligence “Mother.” (Technically, the Nostromo’s computer, voiced by Helen Horton, is the MU-TH-UR 6000.) In this particular case, however, that choice isn’t just a cute homage. I Am Mother’s title character may be a robot, but she’s also genuinely a parent—perhaps the last one on Earth. Opening text informs us that the cavernous, futuristically high-tech building in which most of the film will take place is a repopulation facility, and that it’s been just one day since some unspecified “extinction event”; the facility houses 63,000 human embryos, with which this single robot, Mother (who’s physically performed by Luke Hawker but speaks in the warm yet slightly mechanical voice of Rose Byrne), will ensure that Homo sapiens lives on despite whatever cataclysm happened outside its ultra-thick, reinforced walls.
Initially, though, she grows only one infant, whom she calls simply Daughter (briefly seen as a small child, but primarily embodied at roughly age 20 by Clara Rugaard). It’s not entirely clear why the repopulation effort is going so slowly—Mother explains at one point that she must learn how to care for one child before creating others, but that starts to seem less plausible once Daughter is nearly old enough to drink (were there still any bars). Another vaguely suspicious aspect of their life together involves Daughter’s education, which seems rather heavy on difficult ethical/philosophical questions and culminates in an ostensible exam that resembles the Minnesota Multiphasic Personality Inventory. In any case, the true test comes when Daughter hears someone banging on one of the facility’s doors, and finds a woman (Hilary Swank) who’s been shot in the abdomen—by a robot exactly like Mother, she claims. Since Mother has always maintained that nobody outside the facility survived, the very existence of another human being is itself a shock, and Daughter decides not only to let the woman in but to keep her hidden.
That isn’t I Am Mother’s only secret, by a long shot. This impressively ambitious first feature, directed by Grant Sputore and written by Michael Lloyd Green (from a story that he and Sputore jointly conceived), has more than one agenda, and doesn’t quite succeed in making them cohere. On one level, it’s an allegory about everyday parenting, creating an extreme variation on the tumultuous moment when every child has to assert her independence and become an autonomous adult. On another level, though, it’s about the potential dangers of creating a fully conscious machine—even one that’s been carefully programmed to make saving human lives its utmost priority. Those two themes aren’t mutually exclusive, but Sputore and Green work hard to keep one of them murky for as long as possible (though anyone who pays attention to certain details and can do basic math will realize early on that something’s up); while that creates suspense, it also undermines what should be the ending’s cathartic power. So many truly disturbing revelations pile up in the final half hour or so that processing the relevant information leaves little time for raw emotion. Swank’s nameless character, in particular, remains a pencil sketch.
Still, there’s no question that Sputore can direct a movie. I Am Mother (which premiered at Sundance earlier this year and was snapped up by Netflix) can’t have had much of a budget, by sci-fi standards, and it does look a tad chintzy when it eventually moves outside. The repopulation facility, however, has been richly imagined by production designer Hugh Bateup, who’s worked extensively with the Wachowskis (going back to The Matrix, on which he served as art director—basically the production designer’s second-in-command) and here finds novel ways to make the immaculate look sinister. And Mother herself is a remarkable creation, both in appearance (she actually looks strangely like Emmet from the Lego Movies: a conglomeration of rounded rectangles capped with two beady round eyes, which swivel on tight arcs to fashion expressions) and in movement (a combination of graceful and lumbering that likely couldn’t be achieved via digital effects). Rugaard, a relative newcomer who previously played a supporting role in Teen Spirit, is the sole (visible) human onscreen for much of the film, and has strong natural presence, though Daughter never really seems as if she’s lived her entire life with just a humanoid robot for company (plus clips from the Carson-era Tonight Show, for some reason). The film just has too much other stuff going on to delve into what it’d be like growing up as the first and so far only member of Humanity 2.0. That’s a failing, but an eminently forgivable one. Far better to have too many heady ideas than too few.
Just when you thought BMW owners couldn’t be more infatuated with their cars, they’ll soon be interacting with the car on a personal level.
Microsoft is designing a multi-modal Intelligent Personal Assistant for BMW, using an open-source platform. The Microsoft Azure-powered BMW Open Mobility Cloud, combined with artificial intelligence technology, will constantly enhance the personal assistant's capabilities to interact with users.
“Microsoft’s Virtual Assistant Solution Accelerator built on Azure provides the necessary technological basis, leveraging Microsoft Azure cloud and AI capabilities such as Bot Framework and Cognitive Services.”
The Intelligent Personal Assistant will soon perform tasks more complex than climate control. You'll be able to use the voice-controlled personal assistant to safely manage e-mails and calendar appointments while you're driving.
To make an appointment with a BMW dealership, you'll be able to just ask your "co-driver" to book it for you. The AI assistant will arrange the whole appointment and even set a reminder for when the service is due.
The system constantly expands its capabilities through regular updates, so the features above will be developed even further.
Check out a quick introduction to the updated AI technology below.