New study data presented today at the Alzheimer’s Association International Conference® 2014 (AAIC®) showed that a positive [18F]flutemetamol PET scan for brain amyloid was a highly significant predictor of progression from amnestic Mild Cognitive Impairment (aMCI) to probable Alzheimer’s disease (pAD).[i] A second study further demonstrated the diagnostic value of [18F]flutemetamol in confirming the presence of neuritic amyloid in those patients with early onset dementia.[ii] [18F]Flutemetamol is GE Healthcare’s investigational radiopharmaceutical product for PET imaging of beta amyloid neuritic plaque density in the brains of adult patients with cognitive impairment who are being evaluated for Alzheimer’s disease and other causes of cognitive impairment.

“Collectively, these data demonstrate the diagnostic value of [18F]flutemetamol and add to the growing body of evidence that it can help physicians identify the histopathology associated with an Alzheimer’s disease diagnosis in specific patients,” said Ger Brophy, PhD, Chief Technology Officer, Life Sciences, GE Healthcare. “If and when approved in the European Union, [18F]flutemetamol will be an important tool to support assessments of patients with cognitive disorders, as well as being a valuable research tool in the hunt for therapies to combat Alzheimer’s.”

Use of [18F]Flutemetamol PET Scans as Indicator of Progression from Amnestic Mild Cognitive Impairment to Probable Alzheimer’s Disease

In the study, 232 participants with aMCI, a diagnosis characterized by cognitive deficits not severe enough to impact daily functioning and thus not meeting the definition of dementia, received a [18F]flutemetamol injection and underwent brain PET scans. Those with positive [18F]flutemetamol scans were approximately 2.5 times more likely to convert to pAD than those with negative scans. The ability of positive [18F]flutemetamol PET images to identify aMCI patients at higher risk of progressing to AD could potentially allow for better patient evaluation and management, and support patient stratification for enrolment in clinical trials of disease-modifying drugs.

“These findings demonstrate the potential role of [18F]flutemetamol in stratifying those patients at higher risk of developing Alzheimer’s disease, beyond its use as a diagnostic tool,” said David Wolk, MD, Assistant Director, Penn Memory Center and Lead Investigator of the study. “In addition to providing patients with potentially important prognostic information about their likelihood of developing dementia, identifying high risk patients could help guide physicians’ recommendations for patient monitoring, care plans and use of diagnostic resources. These are exciting results, but we need further research to fully understand how this might be used in clinical practice.”

Diagnostic Value of [18F]Flutemetamol in Early Onset Dementia

In this study, 80 patients with early onset dementia (younger than age 70), and with physician diagnostic confidence less than 90 percent, underwent [18F]flutemetamol PET scans which were assessed as either amyloid positive or negative. Clinical diagnosis and diagnostic confidence were determined, both before and after disclosure of the scan results. This study demonstrated that the use of [18F]flutemetamol increased diagnostic confidence for physicians, and in many patients, helped to confirm or exclude the diagnosis of Alzheimer’s disease and led to changes in management.

“Early and accurate diagnoses may have implications for both prognosis and treatment among patients with early onset dementia,” said Dr. Marissa Zwan, VU University Medical Center, Amsterdam and Lead Investigator of the study. “Greater diagnostic confidence supports better patient management and helps physicians to determine appropriate treatment options, as well as helping patients and caregivers to plan for the future.”

The data showed that 20 percent of patients had a change in diagnosis following review of the [18F]flutemetamol scan. In particular, for those patients diagnosed with Alzheimer’s disease prior to a [18F]flutemetamol scan and who had an amyloid negative result, clinical diagnosis changed in 12 of 15 patients. Overall, confidence in diagnosis significantly increased from 67(±12) percent to 90(±16) percent after disclosing PET results. Additionally, in 48 percent of patients, [18F]flutemetamol PET results led to a change in patient healthcare management (e.g. medication changes, additional care).

In October 2013, [18F]flutemetamol received approval from the U.S. Food and Drug Administration, where it is marketed as VIZAMYL™, for Positron Emission Tomography (PET) imaging of the brain to estimate beta amyloid neuritic plaque density in adult patients with cognitive impairment who are being evaluated for Alzheimer’s disease (AD) or other causes of cognitive decline. VIZAMYL is for diagnostic use only and should be used in conjunction with a clinical evaluation. VIZAMYL is not licensed in any market for estimating the risk of MCI progression to clinical AD.

In June 2014, [18F]flutemetamol received a positive opinion from the European Medicines Agency’s Committee for Medicinal Products for Human Use (CHMP), recommending the granting of a marketing authorization for PET imaging of beta amyloid neuritic plaque density in the brains of adult patients with cognitive impairment who are being evaluated for Alzheimer’s disease and other causes of cognitive impairment. It is not yet approved for use in Europe or Japan.

GE HEALTHCARE’S COMMITMENT TO IMAGING RESEARCH

[18F]Flutemetamol is one component of a broad portfolio of investigational diagnostic solutions that GE Healthcare is currently developing in the neurology field. GE Healthcare is taking a comprehensive approach to understanding dementia and AD through its ongoing research to uncover the causes, risks, and physical effects of the disease. GE Healthcare offers a broad portfolio of imaging resources including cyclotrons and chemistry systems to manufacture PET imaging agents, PET and MR scanners to scan patients, and is developing image analysis software to provide quantification, optimized visualization and reporting tools.

Additionally, GE Healthcare is collaborating with the pharmaceutical industry to assist in their development of the next generation of therapies. To that end, we are working with potential partners in the industry to understand their strategic needs, and helping to provide imaging support for clinical trials of therapeutic agents.

[i] Wolk DA, Duara R, Sadowsky C, et al. : [18F]Flutemetamol Amyloid PET Imaging: Outcome of Phase III Study in Subjects with Amnestic Mild Cognitive Impairment after 3 Year Follow Up. Data presented at Alzheimer’s Association International Conference® 2014
[ii] Zwan MD, Bouwman FH, Lammertsma AA, et al. Clinical Impact of [18F]Flutemetamol PET in Young Onset Dementia. Data presented at Alzheimer’s Association International Conference® 2014

Source: GE Healthcare

A 70-foot-long, 52-ton concrete bridge survived a series of earthquakes in the first multiple-shake-table experiment in the University of Nevada, Reno’s new Earthquake Engineering Lab, the newest addition to the world-renowned earthquake/seismic engineering facility.

“It was a complete success. The bridge withstood the design standard very well and today went over and above 2.2 times the design standard,” John Stanton, civil and environmental engineering professor and researcher from the University of Washington, said. Stanton collaborated with Foundation Professor David Sanders of the University of Nevada, Reno in the novel experiment.

“The bridge performed very well,” Sanders said. “There was a lot of movement, about 12 percent deflection – which is tremendous – and it’s still standing. You could hear the rebar inside the columns shearing, like a zipper opening. Just as it would be expected to do.”

The set of three columns swayed precariously, the bridge deck twisted and the sound filled the cavernous laboratory as the three 14- by 14-foot, 50-ton-capacity hydraulically driven shake tables moved the massive structure.

“Sure we broke it, but we exposed it to extreme, off-the-scale conditions,” Stanton said. “The important thing is it’s still standing, with the columns coming to rest right where they started, meaning it could save lives and property. I’m quite happy.”

The bridge was designed and the components were pre-cast at the University of Washington in Seattle, and then built atop three 14- by 14-foot, 50-ton-capacity hydraulically driven shake tables in the 24,500 square-foot lab. It was shaken in a series of simulated earthquakes, culminating in the large ground motions similar to those recorded in the deadly and damaging 1995 magnitude 6.9 earthquake in Kobe, Japan.

The rocking, pre-tensioned concrete bridge support system is a new bridge engineering design the team has developed with the aim of saving lives, reducing on-site construction time and minimizing earthquake damage.

“By building the components off-site we can save time with construction on-site, minimizing interruptions in traffic and lowering construction costs,” Sanders said. “In this case, the concrete columns and beams were pre-cast and tensioned at the University of Washington. Other components were built here at the University of Nevada, Reno. It took us only a month to build the bridge, in what would otherwise be a lengthy process.”

“This can’t be done anywhere else in the nation, and perhaps the world,” Ian Buckle, director of the lab and professor of civil engineering, said of the test. “Of course we’ve been doing these types of large-scale structures experiments for years, but it’s exciting to have this first test using multiple tables in this building complete. It’s good to see the equipment up and running successfully.”

When combined with the University’s Large-Scale Structures Laboratory, just steps away from the new lab, the facility comprises the biggest, most versatile large-scale structures and earthquake/seismic engineering facility in the United States, according to the National Institute of Standards and Technology, and possibly the largest university-based facility of its kind in the world.

A grand opening was held recently for the $19 million lab expansion project, funded with $12.2 million from the U.S. Department of Commerce’s National Institute of Standards and Technology, along with funds from the Department of Energy, the University and donors. The expansion allows a broader range of experiments and provides additional space to add a fifth large shake table.

“Our facility is unique worldwide and, combined with the excellence of our faculty and students, will allow us to make even greater contributions to the seismic safety of our state, the nation and the world,” Manos Maragakis, dean of the College of Engineering, said. “We will test new designs and materials that will improve our homes, hospitals, offices and highway systems. Remarkable research is carried on here. Getting to this point has taken a lot of hard work. It’s both a culmination and a beginning, ushering in a new era.”

Source: University of Nevada, Reno

A 25-year-long study published in GEOLOGY on 14 July provides the first quantitative measurement of in situ calcium-magnesium silicate mineral dissolution by ants, termites, tree roots, and bare ground. This study reveals that ants are one of the most powerful biological agents of mineral decay yet observed. An understanding of the geobiology of ant-mineral interactions might offer a line of research on how to “geoengineer” accelerated CO2 consumption by Ca-Mg silicates.

Researcher Ronald Dorn of Arizona State University writes that over geological timescales, the dissolution of calcium (Ca) and magnesium (Mg) bearing silicates has led to the gradual drawdown of atmospheric carbon dioxide (CO2) through the accumulation of limestone and dolomite. Many contemporary efforts to sequester CO2 involve burial, with some negative environmental consequences.

Dorn suggests that, given that ant nests as a whole enhance abiotic rates of Ca-Mg dissolution by two orders of magnitude (via biologically enhanced weathering), future research leading to the isolation of the ant-based enhancement process could lead to further acceleration. If ant-based enhancement could reach 100 times or greater, he writes, this process might be able to geo-engineer sequestration of CO2 from the atmosphere. Similarly, ants might also provide clues on geoengineering efficient pathways of calcium carbonate precipitation to sequester atmospheric CO2.

Earth’s climate has cooled significantly over the past 65 m.y., likely from hydrologic regulation, vegetation change, and interactions related to tectonism, in part mediated by Ca-Mg silicate mineral dissolution that draws down CO2. Although speculative, says Dorn, the timing of the expansion in the variety and number of ants in the Paleogene and the Neogene suggests that biologically enhanced weathering by ants could potentially be a part of the puzzle of Cenozoic cooling.

**FEATURED ARTICLE**
Ants as a powerful biotic agent of olivine and plagioclase dissolution
Ronald I. Dorn, School of Geographical Sciences and Urban Planning, Arizona State University, Tempe, Arizona 85287-5302, USA. Published online 14 July 2014; http://dx.doi.org/10.1130/G35825.1.

Other GEOLOGY articles (see below) cover such topics as
1. Earth-like soils on Mars;
2. Glaciation on Mars; and
3. Weathering of the Western Wall in Jerusalem.

GEOLOGY articles published online ahead of print can be accessed online at http://geology.gsapubs.org/content/early/recent. All abstracts are open-access at http://geology.gsapubs.org/; representatives of the media may obtain complimentary articles by contacting Kea Giles.

No-wait data centers

New system could reduce data-transmission delays across server farms by 99.6 percent

Big websites usually maintain their own “data centers,” banks of tens or even hundreds of thousands of servers, all passing data back and forth to field users’ requests. Like any big, decentralized network, data centers are prone to congestion: Packets of data arriving at the same router at the same time are put in a queue, and if the queues get too long, packets can be delayed.

At the annual conference of the ACM Special Interest Group on Data Communication, in August, MIT researchers will present a new network-management system that, in experiments, reduced the average queue length of routers in a Facebook data center by 99.6 percent — virtually doing away with queues. When network traffic was heavy, the average latency — the delay between the request for an item of information and its arrival — shrank nearly as much, from 3.56 microseconds to 0.23 microseconds.

Like the Internet, most data centers use decentralized communication protocols: Each node in the network decides, based on its own limited observations, how rapidly to send data and which adjacent node to send it to. Decentralized protocols have the advantage of being able to handle communication over large networks with little administrative oversight.

The MIT system, dubbed Fastpass, instead relies on a central server called an “arbiter” to decide which nodes in the network may send data to which others during which periods of time. “It’s not obvious that this is a good idea,” says Hari Balakrishnan, the Fujitsu Professor in Electrical Engineering and Computer Science and one of the paper’s coauthors.

With Fastpass, a node that wishes to transmit data first issues a request to the arbiter and receives a routing assignment in return. “If you have to pay these maybe 40 microseconds to go to the arbiter, can you really gain much from the whole scheme?” says Jonathan Perry, a graduate student in electrical engineering and computer science (EECS) and another of the paper’s authors. “Surprisingly, you can.”

Division of labor

Balakrishnan and Perry are joined on the paper by Amy Ousterhout, another graduate student in EECS; Devavrat Shah, the Jamieson Associate Professor of Electrical Engineering and Computer Science; and Hans Fugal of Facebook.

The researchers’ experiments indicate that an arbiter with eight cores, or processing units, can keep up with a network transmitting 2.2 terabits of data per second. That’s the equivalent of a 2,000-server data center with gigabit-per-second connections transmitting at full bore all the time.
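As a back-of-the-envelope check (an illustrative calculation, not taken from the paper), the quoted capacity lines up with the server count:

```python
# 2,000 servers, each with a 1 gigabit-per-second link, all
# transmitting at full bore at the same time.
servers = 2000
link_bps = 1e9                                 # 1 Gbps per server
aggregate_tbps = servers * link_bps / 1e12     # convert bits/s to Tbps
print(aggregate_tbps)                          # 2.0 Tbps, in line with the quoted 2.2 Tbps
```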

“This paper is not intended to show that you can build this in the world’s largest data centers today,” Balakrishnan says. “But the question as to whether a more scalable centralized system can be built, we think the answer is yes.”

Moreover, “the fact that it’s two terabits per second on an eight-core machine is remarkable,” Balakrishnan says. “That could have been 200 gigabits per second without the cleverness of the engineering.”

The key to Fastpass’s efficiency is a technique for splitting up the task of assigning transmission times so that it can be performed in parallel on separate cores. The problem, Balakrishnan says, is one of matching source and destination servers for each time slot.

“If you were asked to parallelize the problem of constructing these matchings,” he says, “you would normally try to divide the source-destination pairs into different groups and put this group on one core, this group on another core, and come up with these iterative rounds. This system doesn’t do any of that.”

Instead, Fastpass assigns each core its own time slot, and the core with the first slot scrolls through the complete list of pending transmission requests. Each time it comes across a pair of servers, neither of which has received an assignment, it schedules them for its slot. All other requests involving either the source or the destination are simply passed on to the next core, which repeats the process with the next time slot. Each core thus receives a slightly attenuated version of the list the previous core analyzed.
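The pipelined matching described above can be sketched as follows. This is an illustrative simplification, not Fastpass's actual implementation: each "core" owns one timeslot, greedily matches requests whose endpoints are still free in that slot, and hands everything else on to the next slot.

```python
from collections import namedtuple

Request = namedtuple("Request", ["src", "dst"])

def assign_timeslots(requests, num_slots):
    """Greedy pipelined matching, one timeslot per 'core'.

    Each slot scans the requests left over from the previous slot,
    schedules a request only if neither its source nor its destination
    is already busy in that slot, and passes the rest to the next slot
    (i.e. the next core in the pipeline).
    """
    schedule = {}             # slot -> list of scheduled requests
    pending = list(requests)
    for slot in range(num_slots):
        busy = set()          # endpoints already matched in this slot
        leftover = []
        for req in pending:
            if req.src not in busy and req.dst not in busy:
                schedule.setdefault(slot, []).append(req)
                busy.add(req.src)
                busy.add(req.dst)
            else:
                leftover.append(req)   # attenuated list for the next core
        pending = leftover
    return schedule, pending

reqs = [Request("A", "B"), Request("A", "C"),
        Request("D", "B"), Request("C", "D")]
sched, unserved = assign_timeslots(reqs, num_slots=2)
```

In this toy run, slot 0 schedules A→B and C→D (their endpoints are disjoint), while A→C and D→B are passed on and scheduled in slot 1, leaving no unserved requests. Because each core only ever sees requests the earlier cores declined, the slots can be processed as a pipeline rather than by partitioning source-destination pairs across cores.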

Bottom line

Today, to avoid latencies in their networks, most data center operators simply sink more money into them. Fastpass “would reduce the administrative cost and equipment costs and pain and suffering to provide good service to the users,” Balakrishnan says. “That allows you to satisfy many more users with the money you would have spent otherwise.”

Networks are typically evaluated according to two measures: latency, or the time a single packet of data takes to traverse the network, and throughput, or the total amount of data that can pass through the network in a given interval.

Source: Massachusetts Institute of Technology

Barcelona in 2 minutes

Rob Whitworth, a director of short films, spent more than 360 hours of shooting and production to capture a two-minute video of Barcelona.