Scientists make single-photon sources brighter

Scientists have achieved a major advance in developing a single-photon light source, bringing quantum applications such as quantum computing and quantum cryptography closer to reality.

A research team at the University of California at Santa Barbara (UCSB), headed by Prof. Dirk Bouwmeester (Physics), Prof. Pierre Petroff (Materials and ECE) and Prof. Larry Coldren (ECE), has created a robust micron-sized semiconductor device that can emit single photons on demand at a rate of 31 million photons per second (31 MHz) into an optical fiber, five times better than previously possible. The study makes the cover of the December issue of Nature Photonics. In the publication, the researchers describe how they developed a single-photon source, based on a semiconductor quantum dot, that is structurally robust and provides record-high light-extraction efficiencies of up to 38%. Several innovations contributed to the higher performance, including a new microcavity design, embedded oxide apertures, and electrical gates to control the emission rate of the active quantum-dot material.

"Combining all these tricks in a properly designed semiconductor device enabled us to measure five million single photons per second, corresponding to a net single-photon generation rate of 31 MHz," Stefan Strauf, Assistant Professor at the Stevens Institute of Technology in New Jersey, told PhysOrg.com. "Furthermore, we have recorded single-photon signatures at rates up to 116 MHz, but only under continuous, rather than pulsed, laser excitation due to technical limitations, where the 'on-demand' character is lost. Nevertheless, these experiments demonstrate the potential for further improvements of the devices."

The researchers' design builds on a common technique for generating single photons: using an optical microcavity to harvest the light emitted by an individual quantum dot. When light is squeezed into a small space around a quantum dot with the help of an optical cavity, the electrical carriers inside the quantum dot are forced to emit light more rapidly. Confining the light strongly requires microcavities with very precise geometries. In the past, scientists have etched tiny micropillars with sub-micrometer dimensions to confine the light, but the downside is that the etching produces rough sidewalls that scatter light away.

Instead of etching tiny micropillars, the scientists etched large trenches, leaving cavities with diameters of about 20 microns, and used embedded aluminum-oxide apertures to confine the light field into a tiny space, successfully avoiding scattering losses. The aperture design also turned out to be mechanically stronger than the brittle pillars, enabling the scientists to attach electrical contacts that control the emission of the quantum dots and further improve the photon generation rate (via the Purcell effect). The new design also opened up the possibility of exploring different geometries for the etched trenches that form the cavity (see the cover illustration of Nature Photonics). This geometry influences the shape of the confined light field inside the cavity and thus dictates the polarization of the emitted light. With the help of the electrical contacts, the researchers demonstrated voltage control of the single-photon polarization. This ability is important for quantum cryptography, where the zeros and ones of the binary code are encoded in the photons' polarization.

Another improvement was the scientists' ability to prevent "dark states" from forming, something that had never been achieved before. When an electron and a hole are captured by a quantum dot, they form an exciton with a spin of either one or two. As the scientists explained, a spin of one allows the exciton to couple to the optical field and recombine, but an exciton with a spin of two cannot emit a photon and instead forms a "dark state."

"You want to avoid these so-called dark states from forming," Strauf explained. "And you can do this by preloading the quantum dots with a single electron. This way, after the capture of an electron-hole pair, there are two electrons, which must have opposite spin owing to the Pauli principle. This means that the electron always finds a bright recombination path."

The new single-photon source marks a significant step toward a practical device, in an area where progress has been rapid. As Strauf notes, just five years ago it took eight hours to record a decent single-photon signature from a semiconductor. Compare that with the current timescale of milliseconds, and the odds are high that single-photon sources will continue to improve. Ultimately, the rate of single-photon generation is limited by the lifetime of the dot's excited state, about 0.1 to 1 nanosecond, which corresponds to a 1 to 10 GHz range. These rates have yet to be demonstrated, however.

Besides a high generation rate, quantum applications will also require other improvements. Currently, UCSB's single-photon sources operate at 950-nanometer wavelengths and at cryogenic temperatures, but practical quantum cryptography will require telecommunication wavelengths and operation at room temperature. Nevertheless, the researchers are hopeful that single-photon sources will one day provide the technology for many interesting applications.

Image: This electron micrograph of the single-photon sources developed at UCSB shows the etched trenches that leave behind large, 20-micron cavities. The different geometries influence the shape of the confined light field inside the cavity and thus dictate the polarization of the emitted photons. Credit: Stefan Strauf, et al.

More information: Strauf, Stefan, Stoltz, Nick G., Rakher, Matthew T., Coldren, Larry A., Petroff, Pierre M., and Bouwmeester, Dirk. "High-frequency single-photon source with polarization control." Nature Photonics, Vol. 1, December 2007, pp. 704-708.
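As a rough sanity check on the numbers quoted above: the maximum on-demand rate is simply the inverse of the exciton lifetime, and the ratio of detected to generated photons gives the end-to-end efficiency of the setup. The short Python sketch below reproduces these back-of-envelope figures; it is purely illustrative and implies no project code.

```python
# Back-of-envelope rates for a quantum-dot single-photon source.
# The numbers come straight from the article; the script is illustrative.

def max_rate_hz(lifetime_s):
    """Upper bound on the generation rate: one photon per exciton lifetime."""
    return 1.0 / lifetime_s

# An exciton lifetime of ~0.1-1 ns limits the source to the 1-10 GHz range.
for tau in (0.1e-9, 1.0e-9):
    print(f"lifetime {tau * 1e9:.1f} ns -> max rate {max_rate_hz(tau) / 1e9:.0f} GHz")

# Detected vs. generated: 5 million counts/s measured against a net 31 MHz
# generation rate implies roughly 16% end-to-end efficiency once extraction,
# fiber coupling, and detector losses are combined.
detected, generated = 5e6, 31e6
print(f"end-to-end efficiency ~ {detected / generated:.0%}")
```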


Asus Eee PC Netbook 1000HG WiMAX Coming Soon (Video)

first_imgHuawei’s high performance EM770 embedded module is one of the world’s first 3.5G embedded modules developed for Windows 7. This enables the Eee PC’s 3.5G connectivity along with Windows 7 to provide anytime, anywhere connectivity at home, work, and on the go.The Asus 1000H series PC netbooks incorporates a 1.6GHz Intel Atom N270 CPU, 1GB of RAM, a 160GB hard drive, 802.11b/g WiFi and Bluetooth, a 6 cell battery, and a built in 3.5G modem that can connect to HSDPA with speeds of up to 7.2Mbit and HSUPA with connect speeds of up to 5.76Mbit. The modem also supports EDGE and GPRS if you’d find yourself outside of a 3G coverage area. The Asus Eee PC 1000HG netbook is selling in Taiwan for about $565 US. There has been no release date yet has to when it will be available in the west. © 2009 PhysOrg.com Citation: Asus Eee PC Netbook 1000HG WiMAX Coming Soon (Video) (2009, February 19) retrieved 18 August 2019 from https://phys.org/news/2009-02-asus-eee-pc-netbook-1000hg.html (PhysOrg.com) — At the Mobile World Congress this week, Asus has announced their up coming 1000HG netbook that comes with both WiMAX and Wi-Fi capabilities. Asus has made no commitment to a release date but stated that it’s part of their Anywhere Connectivity announcements this week at Mobile World Congress. Is Facebook listening to me? Why those ads appear after you talk about things This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only. Explore further Asus has also been showing a few other Eee PC models with integrated 3.5G wireless, using a Huawei EM770 WWAN module and Windows 7. These Eee PCs will be among the world’s first to boast 3.5G capabilities designed around Windows 7 mobile broadband architecture.last_img read more


Bean bugs found to harbor bacteria that keep them safe from an insecticide

(Phys.org) — Conventional wisdom says that in order for a species of insect to develop resistance to an insecticide, several generations have to pass, whereby genes from those individuals that have some natural resistance are passed on to their offspring. But sometimes conventional wisdom fails to take into account how some bugs can find a workaround. In this case, it's the bean bug. Researchers in Japan have found that for Riptortus pedestris, the common bean bug, there is a much quicker path: all they have to do is ingest Burkholderia bacteria. Doing so, the team reports in their paper published in the Proceedings of the National Academy of Sciences, makes the bugs nearly impervious to the insecticide fenitrothion, which has historically been applied to soy bean plants to protect them from the bugs that dine on them.

To find out what was going on with bean bugs and the Burkholderia bacteria, the researchers added the bacteria to potting soil in the lab, where they flourished. They followed that by adding fenitrothion, which the bacteria consumed with abandon. Next, they introduced some young bean bugs (nymphs) into the pot, which ate soy bean seedlings the researchers added to the mix.

In examining the guts of the bugs, the researchers found that the bacteria continued to thrive, and that the bugs became immune to the effects of the insecticide as a result, because the bacteria were degrading it before it could harm them. Normally, they say, up to eighty percent of bean bugs will die from such an exposure.

In further tests, the researchers found that bean bugs can harbor up to a hundred million bacteria in their guts, which tends to make them larger than others of the same species.

Fortunately for farmers in Japan, however, it appears that few bean bugs, or their close cousins the chinch bugs, swallow much of the bacteria in the wild. Tests found that only eight percent of such bugs had Burkholderia bacteria in their guts in one area, and none in another; thus very few were able to develop an immunity to fenitrothion.

The research team says that this symbiotic relationship between bean bugs and Burkholderia bacteria, providing the bugs with immunity from an insecticide, is the first such example ever found. But they also note that because it has been found in this case, it is likely occurring in other relationships as well.

Image: Burkholderia pseudomallei colonies on a blood agar plate. Credit: CDC/Courtesy of Larry Stauffer, Oregon State Public Health Laboratory (PHIL #1926), 2002.

More information: Symbiont-mediated insecticide resistance, PNAS, published online April 23, 2012, doi: 10.1073/pnas.1200231109

Abstract: Development of insecticide resistance has been a serious concern worldwide, whose mechanisms have been attributed to evolutionary changes in pest insect genomes such as alteration of drug target sites, up-regulation of degrading enzymes, and enhancement of drug excretion. Here, we report a previously unknown mechanism of insecticide resistance: Infection with an insecticide-degrading bacterial symbiont immediately establishes insecticide resistance in pest insects. The bean bug Riptortus pedestris and allied stinkbugs harbor mutualistic gut symbiotic bacteria of the genus Burkholderia, which are acquired by nymphal insects from environmental soil every generation. In agricultural fields, fenitrothion-degrading Burkholderia strains are present at very low densities. We demonstrated that the fenitrothion-degrading Burkholderia strains establish a specific and beneficial symbiosis with the stinkbugs and confer a resistance of the host insects against fenitrothion. Experimental applications of fenitrothion to field soils drastically enriched fenitrothion-degrading bacteria from undetectable levels to >80% of total culturable bacterial counts in the field soils, and >90% of stinkbugs reared with the enriched soil established symbiosis with fenitrothion-degrading Burkholderia. In a Japanese island where fenitrothion has been constantly applied to sugarcane fields, we identified a stinkbug population wherein the insects live on sugarcane and ≈8% of them host fenitrothion-degrading Burkholderia. Our finding suggests the possibility that the symbiont-mediated insecticide resistance may develop even in the absence of pest insects, quickly establish within a single insect generation, and potentially move around horizontally between different pest insects and other organisms.


Study of moon rocks shows barrage 4 billion years ago was mainly asteroids

(Phys.org) — Researchers have known for some years that the Earth and moon were subjected to a veritable barrage of objects striking their surfaces nearly four billion years ago, but less certain was whether those objects were asteroids, comets, or pieces of protoplanets that had broken apart. Now, new research by a group of lunar scientists who studied moon rocks brought back by astronauts during the Apollo 16 mission has found that the impactors appear to have been mostly asteroids. But not, they write in their paper published in the journal Science, the same kind as those that fall on our planet today.

The researchers, led by Katherine Joy, looked at specific types of moon rocks known as regolith breccias, which are in essence dirt balls with embedded fragments of rocks and other debris from impacts. They are believed to have formed somewhere around three and a half billion years ago, close to the time of the great barrage. To find out more about the fragments, the team put samples in an electron microscope and used other micro-probing techniques to get a closer look. In so doing, they found that many of the fragments were of nearly the same type as carbonaceous chondrite meteorites, which come from certain types of asteroids.

They also found a certain uniformity in the samples that is not present in samples from meteorites that have struck the moon in more recent times, which, the researchers write, suggests that the rocks striking the moon during the barrage were somewhat different from those that strike today, which are quite diverse. The fragments found in different regolith breccias were also sufficiently different from each other to rule out the possibility that they came from a single protoplanet that broke apart.

That leaves asteroids as the most likely kind of object striking both the Earth and the moon during the barrage, which other scientists have suggested occurred due to a relatively sudden change in the distances between the planets in the early solar system. The suspicion is that all or most of the planets formed in rather close proximity to the sun, then slowly moved farther away. If that was the case, then changes in gravitational effects caused by the planets would likely have had a profound impact on other bodies moving around, causing many of them, perhaps, to run into one another and into the planets. Some even suggest the resulting bombardment could have been a major contributing factor to the development of life here on Earth, which many believe arose right around the same time.

More information: Direct Detection of Projectile Relics from the End of the Lunar Basin-Forming Epoch, Science, DOI: 10.1126/science.1219633

Abstract: The lunar surface, a key proxy for the early Earth, contains relics of the asteroids and comets that have pummeled terrestrial planetary surfaces. Surviving fragments of projectiles in the lunar regolith provide a direct measure of the types and, thus, sources of exogenous material delivered to the Earth-Moon system. In ancient [>3.4 billion years ago (Ga)] regolith breccias from the Apollo 16 landing site we located mineral and lithologic relics of magnesian chondrules from chondritic impactors. These ancient impactor fragments are not nearly as diverse as those found in younger (3.4 Ga to today) regolith breccias and soils from the Moon, or that presently fall as meteorites to Earth. This suggests that primitive chondritic asteroids, originating from a similar source region, were common Earth-Moon-crossing impactors during the latter stages of the basin-forming epoch.


Wolfram Alpha expands Facebook analytics

(Phys.org) — Wolfram Alpha, the computational search engine, has announced a major upgrade to its Personal Analytics for Facebook. Now, instead of a few basic facts about a user's Facebook page, those who run the engine on their own page can gain new insights into the various ways people are related to them on the social networking site.

At this time, Wolfram Alpha offers a standard free version with an option to upgrade to one with more complex analytics. The free version allows users to view data about themselves based on information given on their page, as well as information about those people listed as friends. The upgrade focuses mainly on new ways to look at the ties that bind friends together.

To help users gain a deeper perspective on their listed friends, engineers at Wolfram Alpha have come up with unique classifications for them: Insider (friends with many mutual friends), Outsider (friends with few mutual friends), Gateway (friends with links to other groups), Neighbor (friends with few friends outside a local network) and Connector (friends who tie groups of friends together). The application displays information about the different classifications using charts and graphs that help visualize links that may not be obvious when simply looking at a list of friends. Creating a dotted landscape color-coded by classification, for example, highlights islands of friends or associates (work friends versus social friends, etc.) and reveals the friends that link them together. Another view lists each category with the names of the associated friends. Yet another displays a map of the world with dots indicating the locations of friends, with an associated report that highlights information such as which friend is farthest away, how many live in different countries, who lives at the highest elevation, and so on.

Wolfram Alpha has also added a few new features to the self-analysis part of the engine, offering more information about how a user uses Facebook. Users can see what time of day they generally post to their wall, for example, or, perhaps most revealing, view a word cloud, in which words are sized and colored according to how often they are used in posts.

Also announced is a new feature that users can activate if they choose that will begin logging historical information, collecting data on the fly (such as time spent online, pages visited, etc.) for the purpose of creating charts and graphs offering new ways for users to see how they are using Facebook.

More information: blog.wolframalpha.com/2013/01/ … lytics-for-facebook/
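Wolfram has not published the exact criteria behind the friend categories described above, but the general idea, classifying friends by the structure of one's ego network, is easy to illustrate. Below is a minimal Python sketch using networkx; all thresholds and rules are invented for illustration, and the Gateway category is omitted because it would require data about links reaching outside the ego network.

```python
# Minimal sketch of ego-network friend classification in the spirit of
# Wolfram|Alpha's categories. All thresholds and rules here are invented
# for illustration; Wolfram has not published its actual criteria.
import networkx as nx

def classify_friends(ego, me):
    """Label each of my friends by how they sit in my friend network."""
    friends = set(ego[me])
    bc = nx.betweenness_centrality(ego)       # bridging score, computed once
    labels = {}
    for f in sorted(friends):
        mutual = len(set(ego[f]) & friends)    # friends this friend shares with me
        if bc[f] > 0.05:
            labels[f] = "Connector"            # ties groups of friends together
        elif mutual >= 2:
            labels[f] = "Insider"              # several mutual friends
        elif mutual == 0:
            labels[f] = "Outsider"             # no mutual friends
        else:
            labels[f] = "Neighbor"             # only a few local ties
    return labels

# Two friend clusters bridged by "c", plus one isolated friend "f".
g = nx.Graph([("me", n) for n in "abcdef"] +
             [("a", "b"), ("a", "c"), ("b", "c"),    # cluster 1
              ("d", "e"), ("d", "c"), ("e", "c")])   # cluster 2, joined via c
print(classify_friends(g, "me"))
# -> {'a': 'Insider', 'b': 'Insider', 'c': 'Connector',
#     'd': 'Insider', 'e': 'Insider', 'f': 'Outsider'}
```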


Facebook looking for meaning in user posts with deep learning algorithms

(Phys.org) — Officials at Facebook have apparently decided to get serious about making sense of posts by its vast user base. According to MIT's Technology Review, the company (specifically Chief Technical Officer Mike Schroepfer) has announced that it has put together a team of eight professionals with the mission of developing what the software industry has begun calling "deep learning." Deep learning is a type of software programming in which algorithms are created that allow for building simulated neural networks. Such neural networks are capable of "learning" by analyzing patterns over time. Facebook, TR reports, is hoping to use its algorithms to better target ads and to improve its newsfeed.

As anyone who uses Facebook knows, friending people means adding their posts to your personal newsfeed. As the number of friends grows, so too does the number of newsfeed entries. Eventually, a point is reached where it becomes untenable. To deal with this problem, Facebook has created algorithms that are meant to pick out what it believes are the most relevant posts and send only those to the newsfeed, rather than delivering them all. Thus far, this approach has met with mixed reviews from the user community. The company is hoping that giving its algorithms more smarts will help improve the quality of newsfeeds.

Deep learning isn't something Facebook created; the idea has been around for several years. Microsoft and Google have used it (Google most famously to help identify cats in YouTube videos). IBM uses it too, as part of Watson, the supercomputer that beat Jeopardy champions on television. Netflix is experimenting with similar technology to improve its movie-suggestion algorithms, and might serve as a guide for how successful Facebook can expect to be with its initiative (movie suggestions are still not all that relevant). But perhaps that's not the point at all. Instead, maybe it's the process that is the real news. Big companies are starting to spend a lot of money on neural networks with the idea of a big payoff. Surely Facebook would be happy if it could improve its newsfeed, but the real motive, as with any big business, will always be about improving the bottom line. And if deep learning algorithms can figure out a way to coax users into clicking on more ads, then the initiative will most certainly be deemed a success by all concerned.

More information: via Technology Review
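Facebook has not disclosed its model, but the core mechanism the article describes, a layered network that learns to score posts from engagement patterns, fits in a few lines of plain Python/numpy. Everything below (the feature names, the training data, the tiny one-hidden-layer architecture) is invented purely to illustrate the idea, not to describe Facebook's system.

```python
# Toy neural "feed ranker": learns a relevance score from post features.
# Features and training data are invented; this only illustrates the
# deep-learning idea (a layered nonlinear model fit by gradient descent).
import numpy as np

rng = np.random.default_rng(0)

# Features per post: [friend_closeness, post_age_hours, num_likes, has_photo]
X = np.array([[0.9,  1.0, 50, 1],
              [0.1, 30.0,  2, 0],
              [0.7,  5.0, 20, 1],
              [0.2, 48.0,  1, 0]], dtype=float)
y = np.array([[1.0], [0.0], [1.0], [0.0]])   # 1 = user engaged with the post

X = (X - X.mean(0)) / X.std(0)               # normalize features

# One hidden layer with tanh nonlinearity, sigmoid output.
W1, b1 = rng.normal(0, 0.5, (4, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(500):                 # plain gradient descent
    h = np.tanh(X @ W1 + b1)         # hidden activations
    p = sigmoid(h @ W2 + b2)         # predicted engagement probability
    dz2 = (p - y) / len(X)           # gradient of cross-entropy wrt output logit
    dW2, db2 = h.T @ dz2, dz2.sum(0)
    dh = dz2 @ W2.T * (1 - h**2)     # backpropagate through tanh
    dW1, db1 = X.T @ dh, dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

scores = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
print("ranked post indices, best first:", np.argsort(-scores))
```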


Apple is granted hover and heart-rate monitoring patents

(Phys.org) — Apple has been awarded patents that include one for an accurate touch-and-hover panel, originally filed back in 2010, and another for an embedded heart rate monitor, originally filed in 2009. Details of the patents were reported by AppleInsider.

The touch-and-hover patent involves devices that can recognize something like a finger or stylus hovering above a display panel without actually touching the screen. The patent is named "Touch and hover signal drift compensation," and it describes a system in which a touchscreen display can accurately determine both hover and touch events.

According to the patent, "Some touch sensitive devices can also recognize a hover event, i.e., an object near but not touching the touch sensor panel, and the position of the hover event at the panel. The touch sensitive device can then process the hover event in a manner similar to that for a touch event, where the computing system can interpret the hover event in accordance with the display appearing at the time of the hover event, and thereafter can perform one or more actions based on the hover event."

Brian Michael King, Omar Leung, Paul G. Puskarich, Jeffrey Traer Bernstein, Andrea Mucignat, Avi E. Cieplinski, Muhammad U. Choudry, Praveen R. Subramani, Marc J. Piche, David T. Amm and Duncan Robert Kerr were named as inventors on the patent.

The other patent issued is "Seamlessly embedded heart rate monitor." It presents an electronic device with an integrated sensor for detecting a user's cardiac activity and cardiac electrical signals. The patent was filed in 2009 and names Gloria Lin, Taido Nakajima, Pareet Rahul, and Andrew Hodge as its inventors. Here is where it gets interesting: one application might be a biometric tool for secure identification, or for reading moods. According to the patent, "Using the detected signals, the electronic device can identify or authenticate the user and perform an operation based on the identity of the user. In some embodiments, the electronic device can determine the user's mood from the cardiac signals and provide data related to the user's mood."

The patent also noted that "electrical signals generated by the user can be transmitted from the user's skin through the electronic device housing to the leads." In discussing biometric-based approaches to authenticating a user, the patent explained: "In some embodiments, an electronic device can authenticate a user based on the attributes of the user's heartbeat. For example, the durations of particular portions of a user's heart rhythm, or the relative size of peaks of a user's electrocardiogram (EKG) can be processed and compared to a stored profile to authenticate a user of the device."

AppleInsider noted that "The latest Apple patent is evidence that the company is actively investigating deployable biometric security solutions that rely on users' bodies rather than stored or memorized codes, one such system being the Touch ID fingerprint reader."

More information:
— Touch and hover signal drift compensation
— Seamlessly embedded heart rate monitor
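The patent describes the comparison only in general terms. As a rough illustration of the kind of template matching it sketches (durations of heart-rhythm portions and relative peak sizes compared against a stored profile), here is a toy Python example; the feature choices, tolerance, and synthetic signal are all invented, and nothing here reflects Apple's actual implementation.

```python
# Toy illustration of template-style EKG authentication: compare simple
# heart-rhythm features against a stored user profile. Feature choices,
# thresholds, and the synthetic signal are invented; the patent does not
# disclose a concrete algorithm.
import numpy as np

def ekg_features(signal, fs):
    """Crude features: mean beat-to-beat (R-R) interval and relative peak size."""
    threshold = 0.6 * signal.max()
    interior = signal[1:-1]
    peaks = np.where((interior > threshold) &
                     (interior > signal[:-2]) &
                     (interior > signal[2:]))[0] + 1
    rr_mean = np.diff(peaks).mean() / fs           # seconds between beats
    rel_peak = signal[peaks].mean() / signal.std() # peak size relative to signal
    return np.array([rr_mean, rel_peak])

def authenticate(sample, profile, fs=250.0, tol=0.15):
    """Accept if each feature is within a relative tolerance of the profile."""
    features = ekg_features(np.asarray(sample, dtype=float), fs)
    return bool(np.all(np.abs(features - profile) / profile < tol))

# Synthetic "EKG": spikes at ~72 beats per minute plus sensor noise.
fs = 250.0
signal = np.zeros(int(fs * 10))
signal[::int(fs * 60 / 72)] = 1.0
signal += 0.05 * np.random.default_rng(1).normal(size=signal.size)

profile = ekg_features(signal, fs)        # stored at enrollment time
print(authenticate(signal, profile, fs))  # True: features match the profile
```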


Electron spin changes as a general mechanism for general anesthesia?

(Phys.org) — How does consciousness work? Few questions, if any, could be more profound. One thing we do know about it, jokes biophysicist Luca Turin, is that it is soluble in chloroform. When you put the brain into chloroform, the lipids that form nerve cell membranes and the myelin that insulates them will dissolve. On the other hand, when you put chloroform into the brain, by inhaling it, consciousness dissolves. It is hard to imagine a satisfying explanation of consciousness that does not also account for how anesthetics like chloroform can abolish it.

Lipid solubility appears to be one key clue to anesthesia. The empirical cornerstone of anesthesiology is a 100-year-old rule of thumb known as the Meyer-Overton relationship. It provides that the potency of general anesthetics (GAs), regardless of their size or structure, is approximately proportional to how soluble they are in lipids. Since that time, studies have suggested that GAs can also bind to lipid-like parts of proteins, presumably those near or embedded within cell membranes.

The first real stab at explaining the "how" of anesthetics, as opposed to just the "where," has now been taken by Turin and his colleagues Efthimios Skoulakis and Andrew Horsfield. Their new work, just published in PNAS, suggests that volatile anesthetics operate by perturbing the internal electronic structure of proteins. This would lead to changes in electron currents in those proteins, in cells, and in the organism. They don't just theorize about these effects; they actually measure the electron currents in anesthetized flies using a technique known as electron spin resonance (ESR, often called electron paramagnetic resonance). ESR is similar to nuclear magnetic resonance, the techno-phenomenon at the heart of the modern MRI machine. The main difference is that in ESR excited electron spins are measured instead of proton resonance. Typically, microwaves are applied in the presence of a magnetic field to a sample (or whole organism) inside the resonator cavity of an ESR spectrometer. An ESR signal is diagnostic of unpaired electrons, which exist only in certain cellular structures. One particularly strong signal, for example, is that of melanin, which can be accounted for in experiments by comparison with mutants lacking normal melanin content.

What Turin and colleagues have shown is that the total amount of free electron spins in fruit flies increases when they are exposed to general anesthetics. The amount of free spins generated during anesthesia is independent of melanin content and far larger than any signal previously measured from free radicals, the other source of spin, which are normally very unstable and undetectable in the absence of "spin traps" to capture them. Furthermore, mutants of Drosophila that have been selected for resistance to certain anesthetics show a reduced, sometimes absent, spin signal.

Image: Effect of the anesthetic noble gas xenon on the electronic structure of two short peptides. Top: the highest occupied molecular orbital (HOMO, purple surface) for two 9-residue helices positioned close to each other; a small fraction of the HOMO extends from one helix to the other. Bottom: when a xenon atom (gold sphere) is in the gap, the orbital spread increases. The transparent surface is the Van der Waals electron density. Credit: Luca Turin

Why did Turin and his musketeers try the experiment in the first place? Some of Turin's most alluring science has been a side effect of his passion for perfume. While not intending to become the fly whisperer that he is today, Turin was able to use these creatures to demonstrate detection of odorants by molecular vibrations. The key mechanism here, and the link to anesthetics, is the concept of inelastic electron tunneling, i.e., an electron current that takes place within the receptor proteins in the presence of odorants.

To account for the fact that a very broad class of compounds act as volatile anesthetics, the researchers propose a unitary mechanism for their action involving electrons. They note that the smallest among them, xenon (Xe), presents a puzzle to chemical theories of anesthetic action. Xe is a wonderful (if expensive) anesthetic, but it has no biologically relevant chemistry to speak of; it is completely inert. Furthermore, it persists as a perfect sphere of electron density and so is devoid of any possibly interesting shape. However, as Turin and colleagues point out, "Xe has physics." In particular, it can conduct electrons, as the IBM researchers who first used a scanning tunneling microscope to write the company's logo in Xe atoms found out.

To see whether this property applies to all anesthetics, and not just Xe, Turin used a modeling technique called density functional theory to show that Xe and other anesthetics affect the highest occupied molecular orbital (HOMO) of the alpha helices common to membrane proteins. The HOMO level of organic molecules or semiconductors is analogous to the valence band maximum in inorganic semiconductors. Intriguingly, while all the anesthetics were found to extend the alpha-helix HOMO level, similar molecules with strong convulsant effects on the brain, but no anesthetic effects, had the smallest HOMO effect.

These results offer a fascinating insight into how anesthetics may be operating, and raise many important new questions. Would spin changes be able to explain, for example, the observation that deeply anesthetized tadpoles (a favorite animal model in anesthesia research) can be quickly returned to normal activity just by subjecting them to a sobering pressure pulse of 50 bars? Are the cessation of consciousness and the apparent concomitant abolishment of spikes both mere epiphenomena of underlying material reorganizations that result from spin changes? In other words, anesthetics may eliminate the wherewithal for spikes, but is that the effect that is really eliminating the conscious state?

Other researchers, in particular those who investigate the solitary acoustic wave nature of spikes, report that the melting point of membranes is lowered by anesthetics while hydrostatic pressure increases it, ostensibly due to latent volume changes. A rectification of these more global thermodynamic intuitions with the lower-level physics and chemistry of electron conduction awaits. The work of Turin and his colleagues breathes refreshing new life into a field whose increasingly beleaguered explanations of yore (like simplistic effects on ion channels) have now started to crumble under the weight of their own exceptions.

More information: Electron spin changes during general anesthesia in Drosophila, PNAS, www.pnas.org/cgi/doi/10.1073/pnas.1404387111
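For readers unfamiliar with ESR, the measurement rests on a single resonance condition: an unpaired electron spin in a magnetic field B absorbs microwaves at the frequency set by h*nu = g*mu_B*B. A quick check with standard constants (the 0.35 T field below is a typical X-band spectrometer value, chosen for illustration rather than taken from the paper):

```python
# ESR resonance condition: h * nu = g * mu_B * B.
# Constants are standard physical values; the 0.35 T field is a typical
# X-band spectrometer setting used here only as an example.
g = 2.0023           # free-electron g-factor
mu_B = 9.274e-24     # Bohr magneton, J/T
h = 6.626e-34        # Planck constant, J*s

B = 0.35             # magnetic field, tesla
nu = g * mu_B * B / h
print(f"resonance frequency: {nu / 1e9:.1f} GHz")  # ~9.8 GHz: X-band microwaves
```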


Physicists find ways to increase antihydrogen production

(Phys.org) — There are many experiments that physicists would like to perform on antimatter, from studying its properties with spectroscopic measurements to testing how it interacts with gravity. But in order to perform these experiments, scientists first need some antimatter. Of course, they won't be finding any in nature (due to antimatter's tendency to annihilate in a burst of energy when it comes in contact with ordinary matter), and creating it in the lab has proven to be very technically challenging for the same reasons.

Now, in a new paper published in Physical Review Letters, Alisher S. Kadyrov, et al., at Curtin University in Perth, Australia, and Swansea University in the UK, have theoretically found a method to enhance the rate of antihydrogen production by several orders of magnitude. They hope that their finding will guide antihydrogen programs toward producing large amounts of antihydrogen, for long confinement times and at cool temperatures, as required by future experiments.

"Laws of physics predict equal amounts of matter and antimatter created after the Big Bang," Kadyrov, Associate Professor at Curtin University, told Phys.org. "One of science's mysteries is where did all the antimatter go? To unravel this mystery, scientists at CERN [the European Organization for Nuclear Research] plan to do gravitational and spectroscopic experiments with antimatter. The simplest example is antihydrogen. However, it is challenging and expensive to create and study antihydrogen in the laboratory."

Antihydrogen is an appealing form of antimatter for scientists to study in part because it is electrically neutral: it consists of an antiproton (a negatively charged proton) and a positron, or antielectron (a positively charged electron). Because it's made of just two antiparticles, antihydrogen is also somewhat easier to produce than larger antiatoms. In 2002, scientists produced antihydrogen in the first dedicated antihydrogen production experiment at CERN, and in 2010 they confined antihydrogen in traps for up to 30 minutes. Eventually, however, the antihydrogen annihilates, such as by impacting the walls of the experimental apparatus or interacting with background gases.

There are a few different ways to produce antihydrogen in the lab, all of which involve colliding or scattering particles off one another. In the new study, the physicists focused on the reaction in which an antiproton is scattered off positronium, a bound state consisting of a positron and an ordinary electron. In a sense, positronium can be thought of as a hydrogen atom in which the proton is replaced by a positron. So far, the antiproton-positronium scattering reaction has been investigated mostly with the positronium in its ground state.

In the new study, the scientists theoretically showed that antiproton collisions with positronium in an excited state, instead of the ground state, can enhance antihydrogen production significantly, particularly at lower energies.

"Our calculations show that a very efficient way of producing antihydrogen is to bring together slow antiprotons with positronium which has been prepared in an excited state, something that is now routine using lasers," Kadyrov said. "It turns out antihydrogen formation increases by several orders of magnitude for positronium in excited states as compared to the ground state, due to unexpected low-energy behavior revealed in our calculations."

For the first time, these theoretical results allow for realistic estimates of antihydrogen formation rates via antiproton-positronium scattering at low energies. Because lower energies are more important in experiments than higher energies, the scientists hope that this method will offer a practical way to create cold antihydrogen, which could then be used to test the fundamental properties of antimatter.

"Scientists from the ALPHA, ATRAP, AEgIS and GBAR Collaborations at CERN are working on producing and trapping antihydrogen in sufficient quantities for experiments on the spectroscopic and gravitational properties of antihydrogen," Kadyrov said. "We believe that the efficient mechanism for antihydrogen formation that our research has unveiled could be used to facilitate these investigations."

The scientists plan to investigate this antihydrogen production mechanism further, with the goal of achieving even better results.

"Presently, positronium can be excited to high-energy states, known as Rydberg states," Kadyrov said. "Next we want to investigate antiproton collisions with positronium in such a state. Given the magnitude of the enhancement we have got for the lower excited states, one can expect that the corresponding enhancement would be enormous. This then could open a very promising way of producing low-energy antihydrogen beams for spectroscopic experiments, for example, for measurements of hyperfine splitting in antihydrogen."

Image: Antihydrogen consists of an antiproton and a positron. Credit: public domain

More information: A. S. Kadyrov, et al. "Antihydrogen Formation via Antiproton Scattering with Excited Positronium." Physical Review Letters. DOI: 10.1103/PhysRevLett.114.183201


Learning how to get computers to develop new and useful materials

(Phys.org) — Science journalist Nicola Nosengo has published a News Feature in the latest issue of the journal Nature outlining the work being done to figure out how to use computers and databases to take on the tasks associated with discovering new and useful materials. In the same issue, a team working at Haverford College outlines a proposed method for machine-learning-assisted materials discovery that makes use of failed experiments.

Discovering new materials for use in solving problems, or to create new types of structures or devices, is notoriously difficult work; most in the field would describe it as haphazard, with most new discoveries coming about at least partly by chance. Most often the process involves clearly defining a problem (e.g., noting that a certain type of battery should be able to hold a charge longer), then looking at all of the materials that have been discovered so far that fall into a certain category to see if any of them might fill the bill, and, if that does not work, striking out into the unknown to see if there is a material that exists naturally in the world that has not yet been identified as a possibility. If that fails, the next step is to see if a new material can be made by combining other materials under various conditions, a process so fraught with difficulties that most simply do not bother, hoping that someone will stumble across a solution by accident sometime in the near future.

But, Nosengo points out, things do not have to go this way. Why not use computers to do the looking for us, he asks, or, perhaps even better, get them to discover new materials for us by virtually combining ingredients and virtually subjecting them to different conditions? Scientists are working on this idea, he notes, starting with building databases that hold information about the basic properties of already known materials, subdivided into classes, such as those with crystal structures (useful in battery making). He notes also that several groups have been working on algorithms to use such data, such as one called simply Intelligent Search, and that the White House got involved back in 2011 by backing the Materials Genome Initiative, which is modeled on the approach of the Human Genome Project.

As one example of an actual project, the team at Haverford demonstrated in their paper a new approach to developing algorithms that allow computers to use reaction data, including data from failed experiments, to predict reaction outcomes, a necessary component of any large system dedicated to creating new materials from basic ingredients without guidance from humans.

More information: Paul Raccuglia et al. Machine-learning-assisted materials discovery using failed experiments, Nature (2016). DOI: 10.1038/nature17439

Abstract: Inorganic–organic hybrid materials such as organically templated metal oxides, metal–organic frameworks (MOFs) and organohalide perovskites have been studied for decades, and hydrothermal and (non-aqueous) solvothermal syntheses have produced thousands of new materials that collectively contain nearly all the metals in the periodic table. Nevertheless, the formation of these compounds is not fully understood, and development of new compounds relies primarily on exploratory syntheses. Simulation- and data-driven approaches (promoted by efforts such as the Materials Genome Initiative) provide an alternative to experimental trial-and-error. Three major strategies are: simulation-based predictions of physical properties (for example, charge mobility, photovoltaic properties, gas adsorption capacity or lithium-ion intercalation) to identify promising target candidates for synthetic efforts; determination of the structure–property relationship from large bodies of experimental data, enabled by integration with high-throughput synthesis and measurement tools; and clustering on the basis of similar crystallographic structure (for example, zeolite structure classification or gas adsorption properties). Here we demonstrate an alternative approach that uses machine-learning algorithms trained on reaction data to predict reaction outcomes for the crystallization of templated vanadium selenites. We used information on 'dark' reactions—failed or unsuccessful hydrothermal syntheses—collected from archived laboratory notebooks from our laboratory, and added physicochemical property descriptions to the raw notebook information using cheminformatics techniques. We used the resulting data to train a machine-learning model to predict reaction success. When carrying out hydrothermal synthesis experiments using previously untested, commercially available organic building blocks, our machine-learning model outperformed traditional human strategies, and successfully predicted conditions for new organically templated inorganic product formation with a success rate of 89 per cent. Inverting the machine-learning model reveals new hypotheses regarding the conditions for successful product formation.
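The mechanics of the approach described in the abstract are easy to picture: encode each archived reaction, failed or successful, as a vector of descriptors, train a classifier, and use it to rank untried conditions. The sketch below shows the idea with scikit-learn; the descriptors, data, and choice of a random-forest model are all invented for illustration (the Haverford team's actual features and model are described in their paper).

```python
# Illustrative sketch of the "dark reactions" idea: train a classifier on
# both failed and successful syntheses to predict whether a new set of
# reaction conditions will crystallize a product. All descriptors and
# data here are invented for illustration.
from sklearn.ensemble import RandomForestClassifier

# Each row: [temperature_C, reaction_time_h, pH, amine_molar_ratio]
X = [
    [120, 24, 3.0, 0.5],   # archived notebook reactions...
    [ 95, 48, 7.5, 1.0],
    [150, 12, 2.0, 0.2],
    [110, 72, 9.0, 1.5],
    [130, 24, 4.0, 0.8],
    [ 90, 36, 8.0, 1.2],
]
y = [1, 0, 1, 0, 1, 0]     # 1 = crystalline product, 0 = failed ("dark") reaction

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Rank untried conditions by predicted probability of success.
candidates = [[125, 30, 3.5, 0.6], [100, 60, 8.5, 1.3]]
for cond, p in zip(candidates, model.predict_proba(candidates)[:, 1]):
    print(cond, f"predicted success probability: {p:.2f}")
```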
