The results showed heavy web users tend to be more depressed and show higher levels of autism traits
When people come off-line, they suffer increased negative mood - just like people coming off illegal drugs
By Daily Mail Reporter, 17 February 2013
Using the internet for hours on end can result in withdrawal symptoms similar to the ‘comedown’ experienced by drug users, scientists warned yesterday.
Researchers found spending excessive periods of time surfing the internet left people in ‘negative moods’. And, like drug addicts, when heavy internet users go back on the web their negative moods lift.
The research was carried out on 60 volunteers with an average age of 25 at Swansea University’s College of Human and Health Sciences. Scientists say the results could mean society is in for some ‘nasty surprises’ if internet use increases as expected.
Professor Phil Reed, from the university’s psychology department, said: ‘Our results show that around half of the young people we studied spend so much time on the net that it has negative consequences for the rest of their lives.
‘When people come offline, they suffer increased negative mood – just like people coming off illegal drugs like ecstasy. These initial results, and related studies of brain function, suggest that there are some nasty surprises lurking on the net for people’s wellbeing.’
The study, published in the international journal PLOS ONE, is the first of its kind into the immediate negative psychological impacts of internet use.
A recent Bonn University study suggests we may all be living in a virtual simulation. If a pixel-lattice that forms the background of this universe is presenting us with an all-encompassing “television picture” of reality, then the whole space-time continuum could be a rigorously designed artifact.
But another study, this one using a small number of meditators, pushes our understanding even further.
Dean Radin, the author of two groundbreaking books on controlled paranormal experiments, The Conscious Universe and Entangled Minds, spoke at a January conference, Electric Universe, in New Mexico. He described his recent pilot study on time and precognition.
After a restless night of sleep, filled with nightmares where velociraptors and chainsaw-wielding maniacs chase you down, you wake up and wonder what caused such vivid, frightful dreams. Could it have been that spicy Thai food you had before bed?
Actually, there is some evidence that eating a spicy meal shortly before going to sleep can lead to some wacko dreams. In fact, eating anything too close to bedtime can trigger more dreams, because late-night snacks increase the body’s metabolism and temperature, explains Dr Charles Bae, a sleep medicine doctor at the Sleep Disorders Center at the Cleveland Clinic. Heightened metabolism and temperature can lead to more brain activity, prompting more action during rapid eye movement sleep, or REM.
About every 90 minutes people experience rapid eye movement sleep as they cycle through the stages of sleep. In REM, when people dream the most, the body’s muscle tone slackens. During REM the brain becomes active, like it does when awake, and the eyes flutter behind the lids. Nightmares only happen during REM, and while nightmares are simply dreams with negative emotions, they stand apart because they rouse the sleeper.
Carrie Steckl, Ph.D., updated Nov 30th 2012
Do you believe in the supernatural? Or do you think that all things ethereal are hogwash? If you're not sure - or if you're on the fence about such touchy matters - a brain scan might be able to help you out.
Researchers from Finland recently explored whether brain activity differed between supernatural believers and skeptics using functional magnetic resonance imaging (fMRI).
Hypnosis can be an effective means for treating phobias, managing stress and anxiety, and even for managing pain, but not all people are hypnotized equally. New research from Stanford suggests that about one quarter of people cannot be hypnotized, and using functional and structural MRI, scientists there think they've figured out why. People more apt to be hypnotized show more activity in areas of the brain associated with executive control and attention, while those showing less activity in those areas cannot be put into a hypnotic state.
Belief in God is part of human nature - Oxford study
Humans are naturally predisposed to believe in gods and life after death, according to a major three-year international study.
By Tim Ross, Religious Affairs Editor, 8:17PM BST 12 May 2011
Led by two academics at Oxford University, the £1.9 million study found that human thought processes were “rooted” to religious concepts. But people living in cities in highly developed countries were less likely to hold religious beliefs than those living a more rural way of life, the researchers found.
The project involved 57 academics in 20 countries around the world, and spanned disciplines including anthropology, psychology, and philosophy. It set out to establish whether belief in divine beings and an afterlife were ideas simply learned from society or integral to human nature.
One of the studies, from Oxford, concluded that children below the age of five found it easier to believe in some “superhuman” properties than to understand human limitations. Children were asked whether their mother would know the contents of a closed box. Three-year-olds believed that their mother and God would always know the contents, but by the age of four, children started to understand that their mothers were not omniscient.
Separate research from China suggested that people across different cultures instinctively believed that some part of their mind, soul or spirit lived on after death. The co-director of the project, Professor Roger Trigg, from the University of Oxford, said the research showed that religion was “not just something for a peculiar few to do on Sundays instead of playing golf. We have gathered a body of evidence that suggests that religion is a common fact of human nature across different societies. This suggests that attempts to suppress religion are likely to be short-lived as human thought seems to be rooted to religious concepts, such as the existence of supernatural agents or gods, and the possibility of an afterlife or pre-life.”
A Cambridge University scientist says evil is a lack of empathy which can be measured and monitored and is susceptible to education and treatment.
"I'm not satisfied with the term 'evil'," Reuters quoted Cambridge University psychology and psychiatry professor Simon Baron-Cohen as saying.
"We've inherited this word... and we use it to express our abhorrence when people do awful things, usually acts of cruelty, but I don't think it's anything more than another word for doing something bad,” he added, saying that “we need a new theory of human cruelty.”
In his recent book Zero Degrees of Empathy, Baron-Cohen suggests a rebranding of evil and defining it in terms of lack of empathy.
The director of the Autism Research Center at Cambridge defines empathy in two parts: the drive to identify other people's thoughts and feelings, and the drive to respond appropriately to those thoughts and feelings.
According to Baron-Cohen, if people fully used their capability to empathize, many conflicts in families and society would be resolved.
"If you think about conflict resolution at the moment, usually we are dependent on diplomatic channels, legal frameworks, or military methods,” he said.
“But all those things operate at a very abstract level and they don't seem to get us very far.
"Empathy is about two people -- two people meeting, getting to know each other and tuning in to what the other person is thinking and feeling."
One of the world's top experts in autism and developmental psychopathology, Baron-Cohen cites at least ten brain regions which make up what he calls the "empathy circuit."
When people hurt others, parts of that circuit are malfunctioning. He also sets out an "empathy spectrum" ranging from zero to six degrees of empathy, and an "empathy quotient" test, which ranks people along that spectrum.
Baron-Cohen says most people are in the middle of the spectrum, with a few particularly attuned and highly empathetic people at the top end.
He says those who fall at the bottom end of the scale should not be labeled evil, but should rather be seen as sick or "disabled" people who need help with their empathy deficiency.
"I try to keep an open mind. I would never want to say a person is beyond help," he says. "Empathy is a skill like any other human skill -- and if you get a chance to practice, you can get better at it."
Our brains react differently to others depending on how we view their social status, researchers say. The Current Biology study found those who see themselves as being of a high status display more brain activity with those they think are equally elevated. The researchers said behaviour was determined by how people saw those around them.
A British expert said first evaluations were crucial in determining how individuals related to each other. It was already known from other studies that monkeys behave this way, changing their behaviour depending on how they perceive the other animal's position in the troop. The 23 participants, who had varying levels of social status, were shown information about someone of higher status and information about someone of lower status.
The team used functional magnetic resonance imaging (fMRI) to measure activity in the ventral striatum, part of the brain's reward system. People who viewed themselves as having a higher subjective socioeconomic status displayed greater brain activity in response to other high-ranked individuals, while those with lower status had a greater response to other low-status individuals.
First evaluations
Dr Caroline Zink, of the US National Institute of Mental Health, who led the study, said: "The way we interact with and behave around other people is often determined by their social status relative to our own, and therefore information regarding social status is very valuable to us. "Interestingly, the value we assign to information about someone's particular status seems to depend on our own." She added that socioeconomic status is not based solely on money, but can also include factors such as accomplishments and habits.
Music releases a chemical in the brain that has a key role in setting good moods, a study has suggested.
The study, reported in Nature Neuroscience, found that the chemical was released at moments of peak enjoyment.
Researchers from McGill University in Montreal said it was the first time that the chemical - called dopamine - had been tested in response to music.
Dopamine increases in response to other stimuli such as food and money.
It is known to produce a feel-good state in response to certain tangible stimulants - from eating sweets to taking cocaine.
Dopamine is also associated with less tangible stimuli - such as being in love.
In this study, levels of dopamine were found to be up to 9% higher when volunteers were listening to music they enjoyed.
The report authors say it is significant in proving that humans obtain pleasure from music - an abstract reward - comparable with the pleasure obtained from more basic biological stimuli.
Music psychologist Dr Vicky Williamson, from Goldsmiths College, University of London, welcomed the paper. She said the research didn't answer why music was so important to humans - but proved that it was.
"This paper shows that music is inextricably linked with our deepest reward systems."
It sounds like something from a science fiction movie. But researchers reckon they have found a way to erase painful memories and post-traumatic stress. They discovered a link between a protein called PKM and recollections of disturbing incidents.
By targeting the specific brain circuit which holds the tormenting memory they believe they could weaken it or wipe it out. The incredible study paves the way for treatment for war veterans and victims of horrific attacks. It may also help drug addicts and people with long-term memory disorders such as Alzheimer’s.
Professor David Glanzman, the study’s senior author, said: “I think it will be feasible."
‘ANXIETY PROTEIN’ THAT HOLDS KEY TO A CURE FOR STRESS
The British team has found that the brain releases an “anxiety protein” when exposed to stress
Thursday April 21,2011
By Victoria Fletcher Health Editor
SCIENTISTS have made a breakthrough in their understanding of stress.
It could lead to new treatments for the one in three people who suffer from stress disorders and depression. And it could help to explain why some people seem to suffer from anxiety more easily than others.
The British team has found that the brain releases an “anxiety protein” when exposed to stress. Levels of this protein, neuropsin, appear to dictate how we react to such situations. The researchers believe that targeting it, or the gene that produces it, could make it possible to manipulate how we respond to stress and to treat people with the conditions it causes. In severe cases, stress can lead to long-term damage, depression and post-traumatic stress.
Lead scientist Dr Robert Pawlak, from the University of Leicester, said: “Stress-related disorders affect a large percentage of the population and generate an enormous personal, social and economic impact. It was previously known that certain individuals are more susceptible to detrimental effects of stress than others. Although the majority of us experience traumatic events, only some develop stress-associated psychiatric disorders such as depression, anxiety or post-traumatic stress disorder. The reasons for this were not clear.”
The research, reported in the journal Nature, showed that a part of the brain that controls emotional responses, called the amygdala, reacts to stress by boosting levels of neuropsin. This in turn triggers a series of chemical events that causes the amygdala to increase activity. Neuropsin interacted with two cell membrane proteins to activate a specific gene that regulates stress response. Further work revealed a link between the neuropsin pathway and the way mice behaved in a maze. Stressed animals stayed away from open, illuminated zones in the maze where they felt exposed and unsafe. But when their amygdala proteins were blocked, either by drugs or gene manipulation, the mice appeared to become immune to stress.
Dr Pawlak said: “We conclude that the activity of neuropsin and its partners may determine vulnerability to stress. We are tremendously excited about these findings. We know that all members of the neuropsin pathway are present in the human brain. They may play a similar role in humans and further research will be necessary to examine the potential of intervention therapies for controlling stress-induced behaviours."
Time to spring clean... your mind? Scientists say memory lapses can be blamed on too much irrelevant information
By Fiona Macrae, Mail Online
Last updated at 8:00 AM on 21st April 2011
If you struggle to remember names and numbers or frequently fail to follow the plot of a film, help could be at hand. Scientists say the problem is that you know too much – and you need to declutter, or spring-clean your mind.
Experiments show that the memory lapses that come with age are not simply due to brain slowing down. Instead, they can be blamed on the well-used brain finding it more and more difficult to stop irrelevant information interfering with the task in hand.
The first step in the study was to compare the working memory of the young and old. Working memory involves holding information in mind while manipulating it mentally.
Examples in everyday life include retaining the plots of films and books to understand or predict what will happen next, and following the thread of a conversation while working out how you can contribute to the topic.
In the context of the study, it involved giving the volunteers groups of sentences and asking them to work out whether each line made sense – and to remember the last word of each sentence.
Rest: Getting a good night's sleep is just one way to spring clean the mind
Overall, the younger people, who had an average age of 23, did better, the Quarterly Journal of Experimental Psychology reports. The Canadian researchers then did a second experiment to see what was hindering the older volunteers, who had an average age of 67. This involved being shown pictures of eight animals and being asked to memorise the order in which the creatures appeared. The volunteers were then shown dozens of the pictures and asked to click on their computer mouse when the first animal in their memorised sequence appeared, then the second and so on.
The older adults found it more difficult to progress, suggesting the previous picture was stuck in their mind. Mervin Blair, of Montreal’s Concordia University, said: ‘We found that the older adults had more difficulty in getting rid of previous information. ‘We found that that accounted for a lot of the working memory problems seen in the study.’
A third study confirmed that the memory problems were not simply due to a slowing down of the mind. Mr Blair, a PhD candidate, says that the older mind appears to have trouble suppressing irrelevant information, making it more difficult to concentrate on the here and now.
The world is set to experience the biggest full moon for almost two decades when the satellite reaches its closest point to Earth next weekend. On 19 March, the full moon will appear unusually large in the night sky as it reaches a point in its cycle known as 'lunar perigee'.
Stargazers will be treated to a spectacular view when the moon approaches Earth at a distance of 221,567 miles in its elliptical orbit - the closest it will have passed to our planet since 1992. The full moon could appear up to 14% bigger and 30% brighter in the sky, especially when it rises above the eastern horizon at sunset or under the right atmospheric conditions.
This phenomenon has reportedly heightened concerns about 'supermoons' being linked to extreme natural events - such as earthquakes, volcanic eruptions and tsunamis. The moon last passed this close to the Earth on 10 January 2005, around the time of the Indonesian earthquake that measured 9.0 on the Richter scale.
Hurricane Katrina in 2005 was also associated with an unusually large full moon. Previous supermoons occurred in 1955, 1974 and 1992 - each of these years experienced extreme weather events, killing thousands of people.
However, an expert speaking to Yahoo! News today believes that a larger moon causing weather chaos is a popular misconception. Dr Tim O'Brien, a researcher at the Jodrell Bank Centre for Astrophysics at the University of Manchester, said: "The dangers are really overplayed. You do get a bit higher than average tides than usual along coastlines as a result of the moon's gravitational pull, but nothing so significant that will cause a serious climatic disaster or anything for people to worry about."
But according to Dr Victor Gostin, a Planetary and Environmental Geoscientist at Adelaide University, there may be a link between large-scale earthquakes in places around the equator and new and full moon situations. He said: "This is because the Earth-tides (analogous to ocean tides) may be the final trigger that sets off the earthquake."
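The "14% bigger and 30% brighter" figures in the article follow from simple geometry: the moon's apparent size scales inversely with its distance, and its apparent brightness with the inverse square of distance. A back-of-envelope check, using the article's perigee figure together with a typical lunar apogee distance of roughly 252,700 miles (the apogee value is an assumption, not from the article):

```python
# Rough check of the supermoon size and brightness claims.
perigee_mi = 221_567   # closest distance, from the article
apogee_mi = 252_700    # assumed typical lunar apogee, not from the article

# Apparent angular size scales inversely with distance.
size_ratio = apogee_mi / perigee_mi
# Apparent brightness scales with the inverse square of distance.
brightness_ratio = size_ratio ** 2

print(f"~{(size_ratio - 1) * 100:.0f}% bigger")      # ~14% bigger
print(f"~{(brightness_ratio - 1) * 100:.0f}% brighter")  # ~30% brighter
```

Comparing the closest and farthest full moons gives ratios of about 1.14 and 1.30, consistent with the article's percentages.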
A study from Stony Brook University has suggested that the earliest humans were not very different from us. Archaeologist John Shea believes that experts have been focusing on the wrong measurement of early human behaviour - 'behavioural modernity' instead of 'behavioural variability.'
Behavioural modernity is a quality supposedly unique to Homo sapiens, while behavioural variability is a quantitative dimension to the behaviour of all living things. For a long time, the European Upper Paleolithic archaeological record has been the standard against which the behaviour of earlier and non-European humans is compared.
During the Upper Paleolithic (45,000-12,000 years ago), Homo sapiens fossils first appear in Europe together with complex stone tool technology, carved bone tools, complex projectile weapons, advanced techniques for using fire, cave art, beads and other personal adornments.
The same behaviours are either universal or very nearly so among recent humans, leading archaeologists to cite evidence of these behaviours as proof of human behavioural modernity. But Shea said that the oldest Homo sapiens fossils occur between 100,000 and 200,000 years ago, in Africa and southern Asia, in contexts lacking clear and consistent evidence for such behavioural modernity.
Archaeologists disagree about the causes, timing, pace, and characteristics of this revolution, but there is a consensus that the behaviour of the earliest Homo sapiens was significantly different from that of more recent "modern" humans.
The way in which people frantically communicate online via Twitter, Facebook and instant messaging can be seen as a form of modern madness, according to a leading American sociologist.
"A behaviour that has become typical may still express the problems that once caused us to see it as pathological," MIT professor Sherry Turkle writes in her new book, Alone Together, which is leading an attack on the information age.
Turkle's book, published in the UK next month, has caused a sensation in America, which is usually more obsessed with the merits of social networking than its drawbacks. She appeared last week on Stephen Colbert's late-night comedy show, The Colbert Report. When Turkle said she had been at funerals where people checked their iPhones, Colbert quipped: "We all say goodbye in our own way."
Turkle's thesis is simple: technology is threatening to dominate our lives and make us less human. Under the illusion of allowing us to communicate better, it is actually isolating us from real human interactions in a cyber-reality that is a poor imitation of the real world.
But Turkle's book is far from the only work of its kind. An intellectual backlash in America is calling for a rejection of some of the values and methods of modern communications. "It is a huge backlash. The different kinds of communication that people are using have become something that scares people," said Professor William Kist, an education expert at Kent State University, Ohio.
The list of attacks on social media is a long one and comes from all corners of academia and popular culture. A recent bestseller in the US, The Shallows by Nicholas Carr, suggested that use of the internet was altering the way we think to make us less capable of digesting large and complex amounts of information, such as books and magazine articles. The book was based on an essay that Carr wrote in the Atlantic magazine. It was just as emphatic and was headlined: Is Google Making Us Stupid?
Another strand of thought in the field of cyber-scepticism is found in The Net Delusion, by Evgeny Morozov. He argues that social media has bred a generation of "slacktivists". It has made people lazy and enshrined the illusion that clicking a mouse is a form of activism equal to real world donations of money and time.
Other books include The Dumbest Generation by Emory University professor Mark Bauerlein – in which he claims "the intellectual future of the US looks dim"– and We Have Met the Enemy by Daniel Akst, which describes the problems of self-control in the modern world, of which the proliferation of communication tools is a key component.
The backlash has crossed the Atlantic. In Cyburbia, published in Britain last year, James Harkin surveyed the modern technological world and found some dangerous possibilities. While Harkin was no pure cyber-sceptic, he found many reasons to be worried as well as pleased about the new technological era. Elsewhere, hit film The Social Network has been seen as a thinly veiled attack on the social media generation, suggesting that Facebook was created by people who failed to fit in with the real world.
Turkle's book, however, has sparked the most debate so far. It is a cri de coeur for putting down the BlackBerry, ignoring Facebook and shunning Twitter. "We have invented inspiring and enhancing technologies, yet we have allowed them to diminish us," she writes.
Fellow critics point to numerous incidents to back up their argument. Recently, media coverage of the death in Brighton of Simone Back focused on a suicide note she had posted on Facebook that was seen by many of her 1,048 "friends" on the site. Yet none called for help – instead they traded insults with each other on her Facebook wall.
Expert board game players utilise specific brain areas
BBC NEWS Health
21 January 2011, last updated at 13:06
Scientists have discovered that expert board game players use a part of their brain that amateurs fail to utilise. The research, published in Science, involved scanning the brains of both professional and amateur Japanese "Shogi" players. Shogi is a Japanese game, similar to chess.
Scientists from the RIKEN Brain Science Institute in Japan said that intuitive playing was probably not due to nature, but brain training. Shogi is a very popular game in Japan, played to professional level. Professional players train for up to ten years, three to four hours a day to achieve the level of expertise needed to play professionally.
Intuitive decisions
They are able to make very quick "intuitive" decisions about which move in any combination on the board would produce the best outcome.
The researchers recruited 30 professional shogi players from the Japanese Shogi Association. They also had a control group of amateur players. The professional players were presented with a game of shogi already in progress and given 2 seconds to choose the next best move - from a choice of four moves. The researchers found that there were significant activations in the caudate nucleus area of the brains of professional players while they were making their quick moves.
Brain activity
In contrast, when amateur players were asked to quickly find the next best move, there was no significant activation in the caudate nucleus. This brain activity was specific to professional players who were making quick decisions about the next best move.
In addition, professionals did not use that area of the brain when they were given a longer time of eight seconds to think strategically about further moves they could make. In this scenario, the caudate nucleus area of the brain was not activated.
The caudate nucleus area of the brain was historically thought to be involved with the control of voluntary bodily movements. However more recently it has also been associated with learning and memory.
Research reveals the biochemical connection between music and emotion
January 19, 2011 By Joel N. Shurkin
You are in a concert hall, listening to music you love, Ludwig van Beethoven's Ninth Symphony. You are happily awaiting the glorious climax in the fourth movement -- you know it's coming -- when the full orchestra and chorus erupt with the "Ode to Joy." The moment is here and you are exhilarated, awash in a sudden wave of pleasure.
When music sounds this good, there's a reason: dopamine.
In research published in the journal Nature Neuroscience, scientists at McGill University in Montreal have established the direct link between the elation stimulated by music and the neurotransmitter dopamine. Dopamine is the same substance that puts the joy in sex, the thrill in certain illegal drugs, and the warm feeling within a woman breast-feeding her child. The substance also may explain why the power of music crosses human cultures, the scientists said.
Valorie N. Salimpoor and other researchers in the lab of Robert J. Zatorre took eight subjects and asked them to bring in music they loved. They chose a broad range of instrumental music, from Samuel Barber's Adagio for Strings (the most popular) to jazz and punk. The test used only familiar music, Zatorre said, because he wanted to make sure he was getting a "maximal response." What the subjects had in common was that the music they brought in gave them the "chills," which is actually a technical term for a kind of emotional response. A positron emission tomography, or PET scan, measured dopamine release.
Dopamine is synthesized in the brain out of amino acids and transmits signals from one neuron to another through the circuits of the brain. The structure in the brain Zatorre's team looked at is the striatum, deep inside the forebrain. The striatum has two subparts: the upper, or dorsal, and the ventral below. Zatorre said the dorsal part of the striatum is connected to the regions of the brain involved in prediction and action, while the ventral is connected to the limbic system, the most primitive and ancient part of the brain, where emotions come from. "When you are anticipating, you are engaging the prediction part of the brain; when you feel the chills, that's emotion," said Zatorre, whose team found that the dopamine triggered both parts. According to the McGill research, dopamine pours into the dorsal striatum during the anticipation phase; when the climax occurs, a reaction is triggered in the ventral striatum that results in a release of pure emotion.
The idea that there was some biochemical reaction involved goes back to the work of the late Leonard B. Meyer in the 1950s. Meyer was a musicologist, not a scientist, but he connected music theory with psychology and neuroscience, linking emotional response to patterns in the music. He did not know the biochemical mechanism. Great composers don't know it either, but they play on this process. Austrian composer Gustav Mahler is famous for creating tension that needs resolution, building intensity until the orchestra explodes in a wave of sound.
The listener knows there is going to be an emotional resolution even if the piece is unfamiliar. And, if the listener knows it is coming, the reaction can be even more intense. It turns out, said Zatorre, that Mahler -- and conductors performing his music -- play with the emotions of the audience by manipulating dopamine. "What we're finding is that this is the brain mechanism that underlies this phenomenon," Zatorre said.