Post List

Computer Science / Engineering posts


  • November 30, 2011
  • 01:22 AM
  • 1,047 views

Virtual Reality for Stress Management

by Dr Shock in Dr Shock MD PhD

Going to a relaxing zone in a natural park, such as a river, waterfall, lake or garden, in virtual reality and doing relaxation exercises supported by a relaxing narrative effectively reduces stress and anxiety. Virtual reality showed greater improvements than video or audio, although the latter two also reduced stress and anxiety. We found [...]


... Read more »

  • November 29, 2011
  • 01:50 AM
  • 1,090 views

Online Disclosure greater than Offline Disclosure?

by Dr Shock in Dr Shock MD PhD

Most of us are afraid of disclosing more online than we would offline, the computer luring us into revealing more information about ourselves than would probably be safe. Self-disclosure is the voluntary and verbal communication of personal information to a targeted recipient. It has three dimensions: frequency, breadth, and depth. Frequency of self-disclosure refers to the amount of [...]


... Read more »

Nguyen, M., Bin, Y., & Campbell, A. (2011) Comparing Online and Offline Self-Disclosure: A Systematic Review. Cyberpsychology, Behavior, and Social Networking. DOI: 10.1089/cyber.2011.0277

  • November 28, 2011
  • 08:56 PM
  • 904 views

The future of computation in drug discovery

by The Curious Wavefunction in The Curious Wavefunction

Computational chemistry as an independent discipline has its roots in theoretical chemistry, itself an outgrowth of the revolutions in quantum mechanics in the 1920s and 30s. Theoretical and quantum chemistry advanced rapidly in the postwar era and led to many protocols for calculating molecular and electronic properties which became amenable to algorithmic implementation once computers came on the scene. Rapid growth in software and hardware in the 80s and 90s led to the transformation of theoretical chemistry into computational chemistry and to the availability of standardized, relatively easy-to-use computer programs like GAUSSIAN. By the end of the first decade of the new century, the field had advanced to a stage where key properties of simple molecular systems, such as energies, dipole moments and stable geometries, could in many cases be calculated from first principles with an accuracy matching experiment. Developments in computational chemistry were recognized by the Nobel Prize for chemistry awarded in 1998 to John Pople and Walter Kohn.

In parallel with these theoretical advances, another thread started developing in the 80s which attempted something much more ambitious: to apply the principles of theoretical and computational chemistry to complex systems like proteins and other biological macromolecules and to study their interactions with drugs. The practitioners of this paradigm wisely realized that it would be futile to calculate properties of such complex systems from first principles, leading to the initiation of parametrized approaches in which properties would be "pre-fit" to experiment rather than calculated ab initio. Typically there would be an extensive set of experimental data (the training set) which would be used to parametrize algorithms which would then be applied to unknown systems (the test set). The adoption of this approach led to molecular mechanics and molecular dynamics, both grounded in classical physics, and to quantitative structure-activity relationships (QSAR), which sought to correlate molecular descriptors of various kinds to biological activity. The first productive approach to docking a small molecule into a protein active site in its lowest-energy configuration was refined by Irwin "Tack" Kuntz at UCSF. And beginning in the 70s, Corwin Hansch at Pomona College had already made remarkable forays into QSAR.

These methods gradually started to be applied to actual drug discovery in the pharmaceutical industry. Yet it was easy to see that the field was getting far ahead of itself, and in fact even today it suffers from the same challenges that plagued it thirty years back. Firstly, nobody had solved the twin cardinal problems of modeling protein-ligand interactions. The first was conformational sampling, wherein you have to exhaustively search the conformational space of a ligand or protein. The second was energetic ranking, wherein you have to rank these structures, either in their isolated form or in the context of their interactions with a protein. Both of these remain the central problems of computation as applied to drug discovery. In the context of QSAR, spurious correlations based on complex combinations of descriptors can easily befuddle its practitioners and create an illusion of causation.
Furthermore, there have been various long-standing problems such as the transferability of parameters from a known training set to an unknown test set, the calculation of solvation energies for even the simplest molecules, and the estimation of entropies. And finally, it's all too easy to forget the sheer complexity of the protein systems we are trying to address, which display a stunning variety of behaviors, from large conformational changes to allosteric binding to complicated changes in ionization states and interactions with water. The bottom line is that in many cases we just don't understand the system we are trying to model well enough.

Not surprisingly, a young field still plagued with multiple problems could be relied upon as no more than a guide when it came to solving practical problems in drug design. Yet the discipline saw unfortunate failures in PR as it was periodically hyped. Even in the 80s there were murmurs about designing drugs using computers alone. Part of the hype unfortunately came from the practitioners themselves, who were less than cautious about announcing the strengths and limitations of their approaches. The consequence was that although there continued to be significant advances in both computing power and algorithms, many in the drug discovery community looked at the discipline with a jaundiced eye.

Yet the significance of the problems that the field is trying to address means that it will continue to be promising. What is its future, and in what direction could it most productively be steered? An interesting set of thoughts is offered in a set of articles published in the Journal of Computer-Aided Molecular Design. The articles are written by experienced practitioners in the field and offer a variety of opinions, critiques and analyses which should be read by all those interested in the future of modeling in the life sciences.

Jurgen Bajorath from the University of Bonn, along with his fellow modelers from Novartis, laments the fact that studies in the field have not aspired to a high standard of validation, presentation and reproducibility. This is an important point. No scientific field can advance if there is wide variation in the presentation of the quality of its results. When it comes to modeling in drug discovery, the proper use of statistics and well-defined metrics has been highly subjective, leading to great difficulty in separating the wheat from the chaff and honestly assessing the impact of specific techniques. Rigorous statistical validation in particular has been virtually non-existent, with highly suspect correlation coefficients being the weapon of choice for many scientists in the field. An important step in emphasizing the virtue of objective statistical methods in modeling was taken by Anthony Nicholls of OpenEye Software, who in a series of important articles laid out the statistical standards and sensible metrics that any well-validated molecular modeling study should aspire to. I suspect that these articles will go down in the annals of the field as key documents.

In addition, as MIT physics professor Walter Lewin is fond of constantly emphasizing in his popular lectures, any measurement you make without knowledge of its uncertainty is meaningless. It is remarkable that in a field as fraught with complexity as modeling, there has been a rather insouciant indifference to the estimation of error and uncertainty.
Modelers egregiously quote numbers for protein-ligand energies, dipole moments and other properties to four or six significant figures when those numbers are suspect even to one decimal place. Part of the problem has simply been an insufficient grounding in statistics. Tying every number to its estimated error margin (if it can be estimated at all) will not only give experimentalists and other modelers an accurate feel for the validity of the analysis and the ensuing improvement of methods, but will also keep semi-naive interpreters from being overly impressed by the numbers. Whether it's finance or pharmaceutical modeling, it's always a bad idea to get swayed by figures.

Then there's the whole issue, as the modelers from Novartis emphasize, of spreading the love. The past few years have seen the emergence of several rigorously constructed datasets carefully designed to test and benchmark different modeling algorithms. The problem is that these datasets have most often been validated in an industry that's famous for its secrecy. Until the pharmaceutical industry makes at least some effort to divulge the results of its studies, a true assessment of the value of modeling methods will always come in fits and starts. I have recently been reading Michael Nielsen's eye-opening book on open science, and it's startling to realize the gains in the advancement of knowledge that can result from sharing of problems, solutions and ideas. If modeling is to advance and practically contribute to drug discovery, it's imperative for industry - historically the most valuable generator of any kind of data in drug discovery - to open its vaults and allow scientists to use its wisdom t... Read more »
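The uncertainty point lends itself to a small illustration. The sketch below is a toy example of my own, not taken from the post or the JCAMD articles: it fits a simple linear, QSAR-style model on a synthetic training set, scores a held-out test set, and reports the test-set correlation coefficient with a bootstrap confidence interval instead of a bare number.

```python
# Toy sketch (hypothetical data and model, for illustration only): a QSAR-style
# train/test workflow that attaches an error bar to the reported correlation.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for molecular descriptors (X) and measured activities (y).
X = rng.normal(size=(120, 5))
true_w = np.array([1.5, -0.7, 0.0, 0.3, 0.9])
y = X @ true_w + rng.normal(scale=1.0, size=120)        # noisy "experimental" data

# Parametrize on the training set, evaluate on an unseen test set.
X_train, X_test = X[:80], X[80:]
y_train, y_test = y[:80], y[80:]
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)   # simple linear fit
y_pred = X_test @ w

def pearson_r(a, b):
    return np.corrcoef(a, b)[0, 1]

# Bootstrap the test-set Pearson r to estimate its uncertainty.
r_obs = pearson_r(y_test, y_pred)
n = len(y_test)
boot = [pearson_r(y_test[idx], y_pred[idx])
        for idx in (rng.integers(0, n, n) for _ in range(2000))]
lo, hi = np.percentile(boot, [2.5, 97.5])

print(f"test-set r = {r_obs:.2f} (95% bootstrap CI: {lo:.2f} to {hi:.2f})")
```

Reporting the interval alongside r is the kind of habit the articles argue for: two methods scoring r = 0.7 and r = 0.8 may be statistically indistinguishable once the error bars are drawn.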

  • November 28, 2011
  • 01:50 PM
  • 590 views

TED Talk: Oxytocin—the moral molecule

by Cath in Basal Science (BS) Clarified

In this TED talk Dr. Paul Zak, a professor at Claremont Graduate University in Southern California, describes how oxytocin is responsible for empathy and why it is the “moral molecule” in humans. I was intrigued by the title of this talk because I had always thought morality was something you learn and are not born [...]... Read more »

Baumgartner, T., Heinrichs, M., Vonlanthen, A., Fischbacher, U., & Fehr, E. (2008) Oxytocin Shapes the Neural Circuitry of Trust and Trust Adaptation in Humans. Neuron, 58(4), 639-650. DOI: 10.1016/j.neuron.2008.04.009  

Domes, G., Heinrichs, M., Michel, A., Berger, C., & Herpertz, S. (2007) Oxytocin Improves “Mind-Reading” in Humans. Biological Psychiatry, 61(6), 731-733. DOI: 10.1016/j.biopsych.2006.07.015  

Kosfeld, M., Heinrichs, M., Zak, P., Fischbacher, U., & Fehr, E. (2005) Oxytocin increases trust in humans. Nature, 435(7042), 673-676. DOI: 10.1038/nature03701  

Moll, J., Zahn, R., de Oliveira-Souza, R., Krueger, F., & Grafman, J. (2005) The neural basis of human moral cognition. Nature Reviews Neuroscience, 6(10), 799-809.

Zak, P., Kurzban, R., & Matzner, W. (2004) The Neurobiology of Trust. Annals of the New York Academy of Sciences, 1032(1), 224-227. DOI: 10.1196/annals.1314.025

Zak, P., Kurzban, R., & Matzner, W. (2005) Oxytocin is associated with human trustworthiness. Hormones and Behavior, 48(5), 522-527. DOI: 10.1016/j.yhbeh.2005.07.009

  • November 28, 2011
  • 08:00 AM
  • 707 views

Oracles Past and Present: Our Means of Managing Information

by Krystal D'Costa in Anthropology in Practice

Our ability to find and share information today is potentially limitless. But how did we get here? From cave paintings to the iPad—how does human innovation bring us here? Go Ask the Oracle We live in an amazing time: We never have to wait to know. At this very moment you could be on a [...]... Read more »

Hargittai, E. (2002) Second-level digital divide: Differences in people's online skills. First Monday, 7(4).

  • November 22, 2011
  • 11:30 AM
  • 764 views

22 website quality markers

by David Bradley in Sciencetext

Many factors affect the web experience and the perception of the quality of a website. Writing in the rather appropriately named International Journal of Information Quality, Jaikrit Kandari of the University of Nebraska, Lincoln and colleagues there and at The University of Texas at Arlington have outlined 21 factors that could be used as a framework [...]
... Read more »

Kandari, J., Jones, E. C., Nah, F. F.-H., & Bishu, R. R. (2011) Information quality on the World Wide Web: development of a framework. Int. J. Information Quality, 2(4), 324-343.

  • November 21, 2011
  • 09:35 PM
  • 531 views

Role of Porous Medium Modelling in Biothermofluids

by Arunn in nOnoScience (a.k.a. Unruled Notebook)

Biothermology, or bio-fluid flow and heat transfer, is an important and developing subdivision of bioengineering. Seeking simplifications for biological processes that are inherently complex is an exciting and useful multidisciplinary pursuit. Recently, I was invited to write a review article on the role of porous medium modelling in biothermofluids for the IISc Journal, a … Continue reading »... Read more »

Narasimhan, A. (2011) The Role of Porous Medium Modeling in Biothermofluids. Journal of the Indian Institute of Science, 91(3), 243-266.

  • November 16, 2011
  • 12:58 PM
  • 1,102 views

Renewable energy rises from the ashes

by Charles Harvey in Charles Harvey - Science Communicator

Old oil and gas wells might soon be reborn as environmentally friendly geothermal power generators. Over $36,000 worth of electricity could be generated from each retrofitted well. ... Read more »

  • November 16, 2011
  • 08:09 AM
  • 750 views

Video Tip of the Week: MapMi, automated mapping of microRNA loci

by Trey in OpenHelix

Today’s video tip of the week is on MapMi. The tool is hosted at the EBI and was developed by the Enright lab. It is a computational system for mapping miRNAs within and across species. As the abstract of their recent paper says: Currently miRBase is their primary repository, providing annotations [...]... Read more »

Guerra-Assuncao, J., & Enright, A. (2010) MapMi: automated mapping of microRNA loci. BMC Bioinformatics, 11(1), 133. DOI: 10.1186/1471-2105-11-133  

  • November 16, 2011
  • 05:28 AM
  • 857 views

New mobile battery is chargeable within 15 minutes and lasts a week

by United Academics in United Academics

Fed up with re-charging your smartphone every single day? Good news: batteries for phones and laptops will soon be able to recharge ten times faster than they do today. Furthermore, these batteries hold a charge ten times larger than current technology allows.... Read more »

Zhao, X., Hayner, C. M., Kung, M. C., & Kung, H. H. (2011) In-Plane Vacancy-Enabled High-Power Si–Graphene Composite Electrode for Lithium-Ion Batteries. Advanced Energy Materials, 1(6), 1079-1084. DOI: 10.1002/aenm.201100426

  • November 15, 2011
  • 03:28 PM
  • 647 views

How to (hopefully) not drown in data

by Emma in we are all in the gutter

More is better, right? Bigger telescopes and bigger surveys are both undoubtedly good things, but to make the best use of these advances we need to be able to handle the corresponding increase in data flow, and subsequent pressure on the astronomical archives which are going to have to cope with it. This is a [...]... Read more »

Berriman, G. B., & Groom, S. L. (2011) How Will Astronomy Archives Survive the Data Tsunami? ACM Queue. arXiv: 1111.0075v1

  • November 13, 2011
  • 07:01 PM
  • 1,134 views

Something you should know about: Quantifier Elimination (Part I)

by Aaron Sterling in CSTheory StackExchange Community Blog

by Arnab Bhattacharyya. About a month ago, Ankur Moitra dropped by my office. We started chatting about what each of us was up to. He told me a story about a machine learning problem that he was working on with Sanjeev Arora, Rong Ge, and Ravi Kannan. On its face, it was not even [...]... Read more »

Tarski, A. (1951) A Decision Method for Elementary Algebra and Geometry. RAND Corporation.

  • November 10, 2011
  • 01:41 PM
  • 949 views

Fantastic free new academic search tool for tracking down -ve findings

by Neurobonkers in Neurobonkers

Does what it says on the tin.... Read more »

  • November 9, 2011
  • 11:44 AM
  • 479 views

Bloodhound Beads Sniffing out Heart Attacks

by Hector Munoz in Microfluidic Future

How would you detect a heart attack? There are some symptoms that might tell you that you are very likely having a heart attack. Although you might feel pain in the chest, shortness of breath or other known physical symptoms, that doesn’t mean you are actually having one. Conversely, you may not experience these symptoms while an attack is well on its way. In addition to physical symptoms, an electrocardiogram can be used to further indicate whether you’re having a heart attack, but it also isn’t always accurate. But what if you could detect a heart attack by monitoring cardiac-specific biomarkers in the blood or saliva? Those attempts are well underway.... Read more »

Du, N., Chou, J., Kulla, E., Floriano, P., Christodoulides, N., & McDevitt, J. (2011) A disposable bio-nano-chip using agarose beads for high performance immunoassays. Biosensors and Bioelectronics, 28(1), 251-256. DOI: 10.1016/j.bios.2011.07.027  

  • November 9, 2011
  • 11:43 AM
  • 763 views

Grasping a new reality

by Cath in Basal Science (BS) Clarified

Need an extra hand?

Mindful of the ageing population in Japan, engineers are motivated to develop robots that could one day assist the elderly in their daily tasks. Developing a robotic hand that shows dexterity and variable pressure capabilities is the first step in creating robots that will be able to perform everyday tasks.

In the October issue of Smart Materials and Structures (doi:10.1088/0964-1726/20/10/105015), Dr. Nagase of Kwansei Gakuin University and his colleagues describe their design for a new robotic hand that mimics the dexterity and grasping abilities of a human hand. Not only does the hand perform similarly to a human hand, but its size and weight are also almost identical to those of an average adult hand.... Read more »

Nagase, J., Wakimoto, S., Satoh, T., Saga, N., & Suzumori, K. (2011) Design of a variable-stiffness robotic hand using pneumatic soft rubber actuators. Smart Materials and Structures, 20(10), 105015. DOI: 10.1088/0964-1726/20/10/105015  

  • November 7, 2011
  • 12:30 PM
  • 1,022 views

Robot spider terrorises office floor - video

by Charles Harvey in Charles Harvey - Science Communicator

Yes, I know it’s only got four legs, but you can’t say it doesn’t look a little bit spidery. Built by scientists from the Korea Institute of Industrial Technology, this quadruped robot scans and analyses its immediate surroundings. Like a daredevil rock climber, it scrutinises each surface for its potential to be the next foot- or handhold. While one wrong move could mean certain death for a climber 1,000 ft up, the robot’s actions are a little less hardcore. In the video above, you can see it traversing a not-so-dangerous world full of stacks of paper. ... Read more »
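For a sense of what "scrutinising each surface" might involve, here is a purely hypothetical toy sketch, not the KITECH robot's actual algorithm: it scans a small heightmap of the surroundings and picks the reachable patch that is flattest and closest as the next foothold.

```python
# Illustrative toy only (invented heightmap and scoring, not the real robot's method):
# scan the surroundings and score candidate footholds, preferring flat, nearby patches.
import numpy as np

rng = np.random.default_rng(1)
heightmap = rng.normal(scale=0.02, size=(20, 20))        # mostly flat office floor
heightmap[5:8, 5:8] += 0.15                               # a stack of paper

def foothold_score(hm, r, c, foot_rc, window=1):
    """Lower is better: combines local roughness with distance from the current foot."""
    patch = hm[max(r - window, 0):r + window + 1, max(c - window, 0):c + window + 1]
    roughness = patch.std()
    distance = np.hypot(r - foot_rc[0], c - foot_rc[1])
    return 10.0 * roughness + 0.1 * distance

foot = (10, 10)                                           # current foot position (row, col)
candidates = [(r, c) for r in range(20) for c in range(20)
              if 1.0 <= np.hypot(r - foot[0], c - foot[1]) <= 4.0]   # reachable step radius
best = min(candidates, key=lambda rc: foothold_score(heightmap, rc[0], rc[1], foot))
print("next foothold:", best)
```

A real legged robot would also fold in stability margins, friction and gait constraints; the point here is only the scan-and-score loop the post describes.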

  • November 7, 2011
  • 05:30 AM
  • 652 views

Evaluating dry eyes

by Pablo Artal in Optics confidential

Dry eye is a condition affecting millions of people. Surprisingly, there is a lack of objective methods for evaluating tear film quality. A new optical method is described here, along with its implications and more...... Read more »

  • November 2, 2011
  • 03:58 AM
  • 738 views

Scientists reach new heights with gecko-inspired robot

by GrrlScientist in Maniraptora

SUMMARY: Engineers finally succeed at building a robot that climbs smooth walls with ease and shuffles across ceilings without crashing to earth -- just like a gecko! ... Read more »

Krahn, J., Liu, Y., Sadeghi, A., & Menon, C. (2011) A tailless timing belt climbing platform utilizing dry adhesives with mushroom caps. Smart Materials and Structures, 20(11), 115021. DOI: 10.1088/0964-1726/20/11/115021

  • October 28, 2011
  • 11:41 AM
  • 499 views

PDMS: The Favorite Material of Microfluidics (for now)

by Hector Munoz in Microfluidic Future

Whether you’ve been learning about microfluidics here at Microfluidic Future or somewhere else, you’ve undoubtedly come across the elastomer poly(dimethylsiloxane) (PDMS). PDMS has radically changed the capabilities of microfluidics (and its price tag) since it was first introduced to the field by George Whitesides in 1998. It has effectively replaced glass and silicon, which were borrowed from existing micromachining industries. PDMS offers great resolution and can contain sub-0.1 µm features. But how is PDMS used, and what makes it so great? Hopefully you’ll have these answers by the end of this post.... Read more »

  • October 25, 2011
  • 02:14 PM
  • 391 views

Structure of Plumes from Burning Aluminized Propellant Estimated Using Fan Beam Emission Tomography

by L. Whitson Jr. in L.Bryce Whitson Jr.

This article introduces the use of Fan Beam Emission Tomography (FBET) to visualize the flame structure of a solid propellant plume. Direct measurement of the scalar values is not feasible...... Read more »
