Cognitive deficit, also known as cognitive impairment, is an inclusive term for any characteristic that acts as a barrier to cognitive performance. The term may describe deficits in global intellectual performance, such as mental retardation; specific deficits in cognitive abilities, such as learning disorders and dyslexia; or drug-induced cognitive and memory impairment, such as that seen with alcohol and the benzodiazepines. Cognitive deficits may be congenital or caused by environmental factors such as brain injuries, neurological disorders, or mental illness.
There is evidence that age trends in cognition have remained consistent over long periods, as shown by Schaie's Seattle Longitudinal Study, which spans fifty years of data, and by seventy years of Wechsler test results. Such studies have identified two distinct patterns of age-related change in cognition: a steady decline in measures of processing abilities such as reasoning, and stability followed by a gradual decline in previously acquired knowledge.
A twelve-year study published in 2012 examined the effects of lifestyle activities on cognitive deficit, in the hope that a study of this length could help older adults ward off cognitive decline to some degree. The study measured how verbal speed, episodic memory, and semantic memory were influenced by physical, social, and cognitive activities. The results indicated that cognitive function during normal aging can be maintained by keeping an active lifestyle. They also suggested that the correlation between lifestyle activities and cognition is not a simple one, as not all cognitive abilities were related to changes in daily activities. Further longitudinal research observing the relationships between activities and cognitive abilities in cognitively impaired adults could therefore be valuable.
Longitudinal studies using brain imaging have attempted to detect early signs of cognitive decline in healthy individuals before clinically noticeable changes occur, with the aim of finding more biomarkers for the early onset of degenerative diseases. The results indicated that there are indeed patterns of brain abnormality associated with cognitive decline.
In a cross-sectional study, researchers compared patients with mood disorders to a control group to determine whether aging affected the groups differently. The patients were tested with Central Nervous System Vital Signs, a neurocognitive test battery. The study examined subjects from ages 18 to 90 and found a sharp decline in patients with mood disorders, relative to the control group, from age 65 onward. These declines were observed in the cognitive areas of attention, executive function, processing speed, and memory.
Although one would expect cognitive decline to have major effects on job performance, there appears to be little to no correlation between health and job performance. With the exception of cognitively demanding jobs such as air-traffic controller, professional athlete, or other elite occupations, age does not seem to impact job performance. Because this conflicts with the results of cognitive tests, the matter has been researched further. One possible explanation is that people rarely need to perform at their maximum. There is a difference between typical functioning, the normal level of functioning in daily life, and maximal functioning, the peak level of functioning that cognitive tests measure. A decrease in maximum cognitive ability may not affect daily life, which requires only the typical level.
One study indicates that childhood hunger may have a protective effect against cognitive decline. One possible explanation is that calorie restriction can delay the onset of age-related changes in the body. Another is a selective survival effect: study participants who experienced hunger in childhood may be the healthiest of their era.