Apple considered iPhone with physical keyboard? Wait, what?


One little decision can provoke so much.

It seems that, in those days when everyone believed that BlackBerrys were the most extraordinary machines on the planet, Apple was still cogitating over its little iPod-phone thingy.

And apparently one of the options the company considered was to have a physical keyboard. Yes, like the BlackBerry. With real physical buttons.

http://www.youtube.com/watch?feature=player_embedded&v=4acWkNihaxc

Tony Fadell — who left Apple to create learning-thermostat company Nest — offered that there were three designs under consideration — one involving a hardware keyboard.

One imagines this might have involved the keyboard sliding out of the phone. You know, like, well, all those wonderful phones that still have that design.

Some might muse that it’s something of a relief that Apple committed itself to touch-screen technology, something that makes using a smartphone peculiarly pleasant. However, what would have happened if Apple had gone with a physical keyboard?

Would everyone else have decided that because Apple is doing it, that must mean it’s cool? Or would some other enterprising company have been the first to go with its instincts and created the first touch-screen smartphone?

Which company might that have been? Microsoft, surely.

Dell says XPS 13 ultrabook exceeds sales expectations


The XPS 13 ultrabook is selling well above expectations, a Dell executive told CNET this week, offering some hope for the new class of skinny laptops.

“A little bit less than 3X the expected demand,” said the executive, Sam Burd, who declined to be more specific, saying Dell “never” discloses numbers.

Still, an upbeat statement about sales — however nonspecific — is good news. Industry observers are watching the category closely to see if it can succeed and take some of the wind out of the sails of the MacBook Air and the iPad. The latter is selling at a blistering pace of more than 10 million a quarter.

“I’m optimistic in the long run about ultrabooks,” said Stephen Baker, an analyst at the NPD Group.

He says PC makers and retailers need to get off the “$399 treadmill” by cutting back on the number of models and making more money off the ones that remain. “Look at the iPad. People are willing to pay $600 or $700 for something that gives them a great experience. Something that looks good and makes them feel comfortable and confident,” he said.

The XPS 13 passes the good-looks test. And it’s thin and light (0.71 inches, 3 pounds).

But it’s not cheap, starting at $999. So, why is it selling so well? “Half the sales of the XPS 13 are coming from enterprise [large corporate] customers. That’s a lot of its success,” Burd said.

And that’s one of the bigger challenges for Dell — to straddle the consumer and corporate markets with a single design. For those who haven’t noticed, Dell is becoming more of an enterprise-centric company and less of a consumer outfit. So, designs like the XPS 13 that appeal to both sets of customers are an imperative.

This trend is sometimes referred to as the “consumerization” of IT: employees bringing their personal devices — like iPads — to work.

Burd says the XPS 13 inherits some of the traits that make the iPad and smartphone so popular. “We took the things that an iPad or smartphone does well, in terms of booting up quickly, being highly mobile…and then took that even further. You can do productivity and not lose anything,” he said, referring to common business tasks like word processing and spreadsheets.

But it’s still corporate-capable. “We can load a company’s image on the system, we can put custom BIOS settings on the system, an asset tag so they can track it,” he said.

This is a different tack than the company took with its original ultrathin laptop, the Adamo. That aluminum-clad, 0.65-inch-thick design — announced back in early 2009 — was the first thoughtful response to the MacBook Air from a first-tier PC maker. But it was never also marketed as a corporate workhorse the way the XPS 13 is.

“The [Adamo] design was cutting edge [and] ended up being great looking but an expensive system with less power. It was run off ULV [ultra-low-voltage] processors that at that time were a lot slower,” he said. The XPS 13 — designed in Austin by Dell — uses much faster Sandy Bridge processors today.

What’s next for Dell? “We think touch becomes a pretty interesting option for products that have Windows 8 loaded on them,” Burd said. But that won’t happen automatically. “Touch adds cost…part of it becoming standard is that people need to see the value of that. It’s still a pretty significant added cost, adding capacitive touch,” he said.

And expect more XPS and Inspiron (Dell’s consumer brand) models later. “We’ll have sister, brother products to the XPS 13 that will build out that portfolio and we’ll have a new design language for the Inspiron too,” he said.

Comparing apples and oranges


Every year, U.S. supermarkets lose roughly 10 percent of their fruits and vegetables to spoilage, according to the Department of Agriculture. To help combat those losses, MIT chemistry professor Timothy Swager and his students have built a new sensor that could help grocers and food distributors better monitor their produce. 

The new sensors can detect tiny amounts of ethylene, a gas that promotes ripening in plants. Swager envisions the inexpensive sensors attached to cardboard boxes of produce and scanned with a handheld device that would reveal the contents’ ripeness. That way, grocers would know when to put certain items on sale to move them before they get too ripe.

“If we can create equipment that will help grocery stores manage things more precisely, and maybe lower their losses by 30 percent, that would be huge,” says Swager, the John D. MacArthur Professor of Chemistry.

Detecting gases to monitor the food supply is a new area of interest for Swager, whose previous research has focused on sensors to detect explosives or chemical and biological warfare agents.

“Food is something that is really important to create sensors around, and we’re going after food in a broad sense,” Swager says. He is also pursuing monitors that could detect when food becomes moldy or develops bacterial growth, but as his first target, he chose ethylene, a plant hormone that controls ripening.

Plants secrete varying amounts of ethylene throughout their maturation process. For example, bananas will stay green until they release enough ethylene to start the ripening process. Once ripening begins, more ethylene is produced, and the ripening accelerates. If that perfect yellow banana is not eaten at peak ripeness, ethylene will turn it brown and mushy.

Fruit distributors try to slow this process by keeping ethylene levels very low in their warehouses. Such warehouses employ monitors that use gas chromatography or mass spectrometry, which can separate gases and analyze their composition. Those systems cost around $1,200 each.

“Right now, the only time people monitor ethylene is in these huge facilities, because the equipment’s very expensive,” Swager says.

Detecting ripeness

Funded by the U.S. Army Research Office through MIT’s Institute for Soldier Nanotechnologies, the MIT team built a sensor consisting of an array of tens of thousands of carbon nanotubes: sheets of carbon atoms rolled into cylinders that act as “superhighways” for electron flow.

To modify the tubes to detect ethylene gas, the researchers added copper atoms, which serve as “speed bumps” to slow the flowing electrons. “Anytime you put something on these nanotubes, you’re making speed bumps, because you’re taking this perfect, pristine system and you’re putting something on it,” Swager says.

Copper atoms slow the electrons a little bit, but when ethylene is present, it binds to the copper atoms and slows the electrons even more. By measuring how much the electrons slow down — reflected in the nanotubes’ electrical resistance — the researchers can determine how much ethylene is present.

To make the device even more sensitive, the researchers added tiny beads of polystyrene, which absorbs ethylene and concentrates it near the carbon nanotubes. With their latest version, the researchers can detect concentrations of ethylene as low as 0.5 parts per million. The concentration required for fruit ripening is usually between 0.1 and 1 part per million.
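
To make the envisioned workflow concrete, here is a minimal sketch of how a handheld reader might convert the sensor’s resistance change into a ripeness call. The calibration constants and function names are hypothetical placeholders, not values from the MIT paper; only the 0.5 ppm detection limit and the 0.1–1 ppm ripening range come from the text above.

```python
# Hypothetical calibration sketch -- constants are illustrative, not measured.
BASELINE_OHMS = 1000.0      # assumed sensor resistance in clean air
OHMS_PER_PPM = 40.0         # assumed resistance rise per ppm of ethylene
DETECTION_LIMIT_PPM = 0.5   # sensitivity reported for the latest version

def ethylene_ppm(measured_ohms: float) -> float:
    """Map a resistance measurement to an ethylene concentration."""
    return max((measured_ohms - BASELINE_OHMS) / OHMS_PER_PPM, 0.0)

def ripeness_call(ppm: float) -> str:
    # Fruit ripening typically requires between 0.1 and 1 ppm.
    if ppm < DETECTION_LIMIT_PPM:
        return "below detection limit"
    return "ripening underway" if ppm <= 1.0 else "risk of overripening"

reading = ethylene_ppm(1028.0)            # example measurement: 0.70 ppm
print(f"{reading:.2f} ppm: {ripeness_call(reading)}")
```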

The researchers tested their sensors on several types of fruit — banana, avocado, apple, pear and orange — and were able to accurately measure their ripeness by detecting how much ethylene the fruits secreted.

Birgit Esser, a postdoc in Swager’s lab, is lead author of the paper describing the sensors; graduate student Jan Schnorr is a co-author.

John Saffell, the technical director at Alphasense, a company that develops sensors, describes the MIT team’s approach as rigorous and focused. “This sensor, if designed and implemented correctly, could significantly reduce the level of fruit spoilage during shipping,” he says.

“At any given time, there are thousands of cargo containers on the seas, transporting fruit and hoping that they arrive at their destination with the correct degree of ripeness,” adds Saffell, who was not involved in this research. “Expensive analytical systems can monitor ethylene generation, but in the cost-sensitive shipping business, they are not economically viable for most of shipped fruit.”

Swager has filed for a patent on the technology and hopes to start a company to commercialize the sensors. In future work, he plans to add a radio-frequency identification (RFID) chip to the sensor so it can communicate wirelessly with a handheld device that would display ethylene levels. The system would be extremely cheap — about 25 cents for the carbon nanotube sensor plus another 75 cents for the RFID chip, Swager estimates.

Physicists Benchmark Quantum Simulator With Hundreds of Qubits


ScienceDaily
Many important problems in physics — especially low-temperature physics — remain poorly understood because the underlying quantum mechanics is vastly complex. Conventional computers — even supercomputers — are inadequate for simulating quantum systems with as few as 30 particles. Better computational tools are needed to understand and rationally design materials, such as high-temperature superconductors, whose properties are believed to depend on the collective quantum behavior of hundreds of particles.

The NIST (National Institute of Standards and Technology) simulator consists of a tiny, single-plane crystal of hundreds of beryllium ions, less than 1 millimeter in diameter, hovering inside a device called a Penning trap. The outermost electron of each ion acts as a tiny quantum magnet and is used as a qubit — the quantum equivalent of a “1” or a “0” in a conventional computer.

In the benchmarking experiment, physicists used laser beams to cool the ions to near absolute zero. Carefully timed microwave and laser pulses then caused the qubits to interact, mimicking the quantum behavior of materials otherwise very difficult to study in the laboratory. Although the two systems may outwardly appear dissimilar, their behavior is engineered to be mathematically identical. In this way, simulators allow researchers to vary parameters that couldn’t be changed in natural solids, such as atomic lattice spacing and geometry.

In the NIST benchmarking experiments, the strength of the interactions was intentionally weak so that the simulation remained simple enough to be confirmed by a classical computer. Ongoing research uses much stronger interactions.

Simulators exploit a property of quantum mechanics called superposition, wherein a quantum particle is made to be in two distinct states at the same time — for example, aligned and anti-aligned with an external magnetic field. The number of states simultaneously available to 3 qubits, for example, is 8, and this number grows exponentially with the number of qubits: 2^N states for N qubits.

Crucially, the NIST simulator also can engineer a second quantum property called entanglement between the qubits, so that even physically well separated particles may be made tightly interconnected.

Recent years have seen tremendous interest in quantum simulation; scientists worldwide are striving to build small-scale demonstrations. However, these experiments have yet to fully involve more than 30 quantum particles, the threshold at which calculations become impossible on conventional computers. In contrast, the NIST simulator has extensive control over hundreds of qubits. This order-of-magnitude increase in qubit number increases the simulator’s quantum state space exponentially. Just writing down on paper the state of a 350-qubit quantum simulator is impossible — it would require more than a googol (10 to the power of 100) of numbers.
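
Both numbers are easy to sanity-check with a few lines of Python:

```python
import math

# N qubits span 2**N simultaneous basis states.
print(2 ** 3)                    # 8 states for 3 qubits, as above

# A 350-qubit state has 2**350 amplitudes -- more than a googol (10**100).
print(2 ** 350 > 10 ** 100)      # True
print(350 * math.log10(2))       # ~105.4, i.e., 2**350 is roughly 10**105
```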

Over the past decade, the same NIST research group has conducted record-setting experiments in quantum computing, atomic clocks and, now, quantum simulation. In contrast with quantum computers, which are universal devices that someday may solve a wide variety of computational problems, simulators are “special purpose” devices designed to provide insight about specific problems.

This work was supported in part by the Defense Advanced Research Projects Agency. Co-authors from Georgetown University, North Carolina State University, and institutions in South Africa and Australia contributed to the research.

Researcher misinterprets Oracle advisory, discloses unpatched database vulnerability


Instructions on how to exploit an unpatched Oracle Database Server vulnerability in order to intercept the information exchanged between clients and databases were published by a security researcher who erroneously thought that the company had patched the flaw.

Oracle’s April 2012 Critical Patch Update (CPU) advisory, published on April 17, credited security researcher Joxean Koret for a vulnerability he reported through cyber intelligence firm iSight Partners.

In an email sent to the Full Disclosure mailing list on April 18, Koret revealed that the vulnerability is located in the Oracle TNS Listener, a component that routes connections from clients to Oracle database servers depending on which database they are trying to reach.

TNS Listener has a default feature, introduced in 1999, that allows clients to register a database service or database instance remotely without authentication, Koret said.

The client sends a remote registration request to the TNS Listener and defines a new service name, its IP address, the database instances under it, and other settings. The TNS Listener then starts routing all client requests that include that service name or database instance.

However, TNS Listener also allows the remote registration of a database instance or service name that is already registered, Koret said. “The TNS listener will consider this newer registered instance name a cluster instance (Oracle RAC, Real Application Clusters) or a fail over instance (Oracle Fail over),” he said.

In this case, the TNS Listener performs load balancing between the two instances by sending the first client to the most recently registered one and the second client to the original one. This allows a local attacker to route between 50 and 75 percent of clients to a database server that he controls, Koret said.

The attacker can then use the TNS Listener on the server he controls to route the client requests back to the legitimate database instance, effectively establishing a TNS proxy that allows him to intercept all data exchanged between clients and the targeted database.
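
To make the interception mechanics concrete, here is a minimal, generic TCP relay sketch — purely an illustration of the proxy concept, not Oracle’s TNS protocol or Koret’s actual exploit. The host name is hypothetical; port 1521 is the conventional Oracle listener port.

```python
import socket
import threading

LISTEN_ADDR = ("0.0.0.0", 1521)               # attacker-controlled listener
REAL_DB = ("legit-db.example.com", 1521)      # hypothetical legitimate server

def pipe(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes one way; a middleman could log or alter them here."""
    while chunk := src.recv(4096):
        dst.sendall(chunk)

def handle(client: socket.socket) -> None:
    upstream = socket.create_connection(REAL_DB)
    # Relay both directions so the client sees a working database
    # while every byte passes through the middleman.
    threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
    pipe(upstream, client)

server = socket.socket()
server.bind(LISTEN_ADDR)
server.listen()
while True:
    conn, _ = server.accept()
    threading.Thread(target=handle, args=(conn,), daemon=True).start()
```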

However, this is not the only attack scenario that this vulnerability allows. By being in a man-in-the-middle situation, the attacker can also inject rogue commands in the SQL queries sent by clients or completely hijack their sessions to execute arbitrary queries, Koret said.

The researcher said he didn’t test whether Oracle’s patch for this vulnerability — which he believed to be included in the April 2012 CPU — actually addressed all attack vectors.

However, after a few follow-up emails with Oracle, he realized that the company hadn’t actually patched the flaw for currently supported versions of the database server, but had instead addressed it in a yet-to-be-released version.

MIT to host 2013 American Nuclear Society Student Conference


The MIT American Nuclear Society Student Section has won the bid to host the 2013 ANS Student Conference in Spring 2013. A team of more than 30 undergraduate and graduate students from the Department of Nuclear Science and Engineering at MIT crafted the successful proposal to bring the conference back to MIT. The conference, which has grown significantly in attendance, visibility and stature over the last decade, has been hosted at MIT three times previously, but not since 1994.

The proposed conference theme, “Public Image of the Nuclear Engineer,” is aimed at developing awareness of political challenges and inspiring young nuclear engineers to engage with society in ways that reflect positively on nuclear technology. It is derived, in part, from the Department of Nuclear Science and Engineering’s educational hallmark: Science-Systems-Society.

MIT-ANS will film and launch a video outreach project at the 2013 conference titled “I’m a Nuke.” The goal is to break the old stereotype of nuclear engineers and introduce the public to today’s nuclear scientists and engineers — young, diverse, and as Steve Jobs would say, “insanely great.”

Algorithmic incentives


In their groundbreaking 1985 paper on interactive proofs, Goldwasser, Micali and the University of Toronto’s Charles Rackoff ’72, SM ’72, PhD ’74 proposed a particular kind of interactive proof, called a zero-knowledge proof, in which a player can establish that he or she knows some secret information without actually revealing it. Today, zero-knowledge proofs are used to secure transactions between financial institutions, and several startups have been founded to commercialize them.

At the Association for Computing Machinery’s Symposium on Theory of Computing in May, Micali, the Ford Professor of Engineering at MIT, and graduate student Pablo Azar will present a new type of mathematical game that they’re calling a rational proof — a variation on interactive proofs that adds an economic component. Like interactive proofs, rational proofs may have implications for cryptography, but they could also suggest new ways to structure incentives in contracts.

“What this work is about is asymmetry of information,” Micali adds. “In computer science, we think that valuable information is the output of a long computation, a computation I cannot do myself.” But economists, Micali says, model knowledge as a probability distribution that accurately describes a state of nature. “It was very clear to me that both things had to converge,” he says.

A classical interactive proof involves two players, sometimes designated Arthur and Merlin. Arthur has a complex problem he needs to solve, but his computational resources are limited; Merlin, on the other hand, has unlimited computational resources but is not trustworthy. An interactive proof is a procedure whereby Arthur asks Merlin a series of questions. At the end, even though Arthur can’t solve his problem himself, he can tell whether the solution Merlin has given him is valid.

In a rational proof, Merlin is still untrustworthy, but he’s a rational actor in the economic sense: When faced with a decision, he will always choose the option that maximizes his economic reward. “In the classical interactive proof, if you cheat, you get caught,” Azar explains. “In this model, if you cheat, you get less money.”

Complexity connection

Research on both interactive proofs and rational proofs falls under the rubric of computational-complexity theory, which classifies computational problems according to how hard they are to solve. The two best-known complexity classes are P and NP. Roughly speaking, P is a set of relatively easy problems, while NP contains some problems that, as far as anyone can tell, are very, very hard.

Problems in NP include the factoring of large numbers, the selection of an optimal route for a traveling salesman, and so-called satisfiability problems, in which one must find conditions that satisfy sets of logical restrictions. For instance, is it possible to contrive an attendance list for a party that satisfies the logical expression (Alice OR (Bob AND Carol)) AND (David AND Ernie AND NOT Alice)? (Yes: Bob, Carol, David and Ernie go to the party, but Alice doesn’t.) In fact, any of the hard problems in NP can be recast as a satisfiability problem.
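
The party-list example is small enough to verify by brute force — a quick sketch enumerating all 32 possible attendance lists:

```python
from itertools import product

# (Alice OR (Bob AND Carol)) AND (David AND Ernie AND NOT Alice)
def satisfied(alice, bob, carol, david, ernie):
    return (alice or (bob and carol)) and (david and ernie and not alice)

solutions = [combo for combo in product([False, True], repeat=5)
             if satisfied(*combo)]
print(len(solutions))   # 1 -- the only valid list
print(solutions[0])     # (False, True, True, True, True): everyone but Alice
```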

To get a sense of how rational proofs work, consider the question of how many solutions a satisfiability problem has — an even harder problem than finding a single solution. Suppose that the satisfiability problem is a more complicated version of the party-list problem, one involving 20 invitees. With 20 invitees, there are 1,048,576 possibilities for the final composition of the party. How many of those satisfy the logical expression? Arthur doesn’t have nearly enough time to test them all.

But what if Arthur instead auctions off a ticket in a lottery? He’ll write down one perfectly random list of party attendees — Alice yes, Bob no, Carol yes and so on — and if it satisfies the expression, he’ll give the ticketholder $1,048,576. How much will Merlin bid for the ticket?

Suppose that Merlin knows that there are exactly 300 solutions to the satisfiability problem. The chances that Arthur’s party list is one of them are thus 300 in 1,048,576. By a standard expected-value calculation, a 300-in-1,048,576 shot at $1,048,576 is worth exactly $300. So if Merlin is a rational actor, he’ll bid $300 for the ticket. From that bid, Arthur can deduce the number of solutions.
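
In code, the whole exchange reduces to one expected-value calculation — a sketch using the numbers from the example:

```python
TOTAL = 2 ** 20        # 1,048,576 possible party lists for 20 invitees
PAYOUT = TOTAL         # Arthur's prize if his random list satisfies the formula
num_solutions = 300    # known only to Merlin

# A rational Merlin bids the ticket's expected value: P(win) * payout.
merlin_bid = (num_solutions / TOTAL) * PAYOUT
print(merlin_bid)      # 300.0

# Arthur inverts the bid to recover the count without any hard computation.
print(round(merlin_bid * TOTAL / PAYOUT))   # 300
```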

First-round knockout

The details are more complicated than that, and of course, with very few exceptions, no one in the real world wants to be on the hook for a million dollars in order to learn the answer to a math problem. But the upshot of the researchers’ paper is that with rational proofs, they can establish in one round of questioning — “What do you bid?” — what might require millions of rounds using classical interactive proofs. “Interaction, in practice, is costly,” Azar says. “It’s costly to send messages over a network. Reducing the interaction from a million rounds to one provides a significant savings in time.”

“I think it’s yet another case where we think we understand what’s a proof, and there is a twist, and we get some unexpected results,” says Moni Naor, the Judith Kleeman Professorial Chair in the Department of Computer Science and Applied Mathematics at Israel’s Weizmann Institute of Science. “We’ve seen it in the past with interactive proofs, which turned out to be pretty powerful, much more powerful than you normally think of proofs that you write down and verify as being.” With rational proofs, Naor says, “we have yet another twist, where, if you assign some game-theoretical rationality to the prover, then the proof is yet another thing that we didn’t think of in the past.”

Naor cautions that the work is “just at the beginning,” and that it’s hard to say when it will yield practical results, and what they might be. But “clearly, it’s worth looking into,” he says. “In general, the merging of the research in complexity, cryptography and game theory is a promising one.”

Micali agrees. “I think of this as a good basis for further explorations,” he says. “Right now, we’ve developed it for problems that are very, very hard. But how about problems that are very, very simple?” Rational-proof systems that describe simple interactions could have an application in crowdsourcing, a technique whereby computational tasks that are easy for humans but hard for computers are farmed out over the Internet to armies of volunteers who receive small financial rewards for each task they complete. Micali imagines that they might even be used to characterize biological systems, in which individual organisms — or even cells — can be thought of as producers and consumers.

A camera that peers around corners


In a paper appearing this week in the journal Nature Communications, MIT researchers describe using their system to produce recognizable 3-D images of a wooden figurine and of foam cutouts outside their camera’s line of sight. The research could ultimately lead to imaging systems that allow emergency responders to evaluate dangerous environments or vehicle navigation systems that can negotiate blind turns, among other applications.

The principle behind the system is essentially that of the periscope. But instead of using angled mirrors to redirect light, the system uses ordinary walls, doors or floors — surfaces that aren’t generally thought of as reflective.

The system exploits a device called a femtosecond laser, which emits bursts of light so short that their duration is measured in quadrillionths of a second. To peer into a room that’s outside its line of sight, the system might fire femtosecond bursts of laser light at the wall opposite the doorway. The light would reflect off the wall and into the room, then bounce around and re-emerge, ultimately striking a detector that can take measurements every few picoseconds, or trillionths of a second. Because the light bursts are so short, the system can gauge how far they’ve traveled by measuring the time it takes them to reach the detector.
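
The arithmetic behind that claim is straightforward: light covers about 0.3 millimeters per picosecond, so picosecond timing resolves path lengths to fractions of a millimeter. A quick check:

```python
C = 299_792_458.0   # speed of light, m/s

def path_length_m(arrival_time_s: float) -> float:
    """Total distance a pulse traveled before reaching the detector."""
    return C * arrival_time_s

# Distance light covers in one picosecond, in millimeters:
print(f"{C * 1e-12 * 1000:.3f} mm")   # ~0.300 mm per picosecond
```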

The system performs this procedure several times, bouncing light off several different spots on the wall, so that it enters the room at several different angles. The detector, too, measures the returning light at different angles. By comparing the times at which returning light strikes different parts of the detector, the system can piece together a picture of the room’s geometry.

Off the bench

Previously, femtosecond lasers had been used to produce extremely high-speed images of biochemical processes in a laboratory setting, where the trajectories of the laser pulses were carefully controlled. “Four years ago, when I talked to people in ultrafast optics about using femtosecond lasers for room-sized scenes, they said it was totally ridiculous,” says Ramesh Raskar, an associate professor at the MIT Media Lab, who led the new research.

Andreas Velten, a former postdoc in Raskar’s group who is now at the Morgridge Institute for Research in Madison, Wis., conducted the experiments reported in Nature Communications using hardware in the lab of MIT chemist Moungi Bawendi, who’s collaborating on the project. Velten fired femtosecond bursts of laser light at an opaque screen, which reflected the light onto objects suspended in front of another opaque panel standing in for the back wall of a room.

The data collected by the ultrafast sensor were processed by algorithms that Raskar and Velten developed in collaboration with Otkrist Gupta, a graduate student in Raskar’s group; Thomas Willwacher, a mathematics postdoc at Harvard University; and Ashok Veeraraghavan, an assistant professor of electrical engineering and computer science at Rice University. The 3-D images produced by the algorithms were blurry but easily recognizable.

Raskar envisions that a future version of the system could be used by emergency responders — firefighters looking for people in burning buildings or police determining whether rooms are safe to enter — or by vehicle navigation systems, which could bounce light off the ground to look around blind corners. It could also be used with endoscopic medical devices, to produce images of previously obscure regions of the human body.

The math required to knit multiple femtosecond-laser measurements into visual images is complicated, but Andrew Fitzgibbon, a principal researcher at Microsoft Research who specializes in computer vision, says it does build on research in related fields. “There are areas of computer graphics which have used that sort of math,” Fitzgibbon says. “In computer graphics, you’re making a picture. Applying that math to acquiring a picture is a great idea.” Raskar adds that his team’s image-reconstruction algorithm uses a technique called filtered backprojection, which is the basis of CAT scans.

Indeed, Fitzgibbon says, the real innovation behind the project was the audacity to try it. “Coming at it from both ends, from the raw scientific question — because, you know, it is kind of a scientific question: ‘Could we see around a corner?’ — to the extreme engineering of it — ‘Can we time these pulses to femtoseconds?’ — that combination, I think, is rare.”

In its work so far, Raskar says, his group has discovered that the problem of peering around a corner has a great deal in common with that of using multiple antennas to determine the direction of incoming radio signals. Going forward, Raskar hopes to use that insight to improve the quality of the images the system produces and to enable it to handle visual scenes with a lot more clutter.

A tough calculation


Why don’t more women enter the male-dominated profession of engineering? Some observers have speculated it may be due to the difficulties of balancing a demanding career with family life. Others have suggested that women may not rate their own technical skills highly enough.

However, a recent paper co-authored by MIT social scientist Susan Silbey, based on a four-year study of female engineering students, offers a different story. Contrary to the stereotype, the study finds, women are no more hesitant than men when it comes to mixing family and work. Moreover, their self-assessments of their math skills do not predict whether they will stick with engineering. Instead, the study finds, women feel less comfortable in engineering than men, and lack the “professional role confidence” that male engineers seem to acquire easily.

“The further they get from the classroom, the more women don’t like the experience,” says Silbey, the Leon and Anne Goldberg Professor of Humanities and professor of sociology and anthropology at MIT. “They find there is too large a gap between the idea of being an engineer and the practice of it.” Women who have internships or jobs, she explains, find they “are too often relegated to ‘female’ roles of note-taker, organizer or manager,” and “don’t think they want to do this kind of work.”

Willing to balance family and work

In the study, a team of researchers tracked the progress of 720 students — more than 300 of them in engineering programs — between 2003 and 2007 at four institutions in Massachusetts: MIT, the Franklin W. Olin College of Engineering, Smith College and the University of Massachusetts at Amherst. The team gathered information about the students’ performance and experiences in the engineering profession from surveys, student diaries, interviews with faculty and administrators, and classroom visits.

The results are detailed in a paper, “Professional Role Confidence and Gendered Persistence in Engineering,” published recently in the American Sociological Review.  

The researchers found that women in the engineering programs were twice as likely as men to switch to other science, technology, engineering and math (STEM) majors, and that men report higher levels of “intentional persistence,” meaning they are more inclined to picture themselves as engineers five years in the future. The women and men in the study earned similar grades, and the study controlled for classroom achievement, meaning the data shows divergent decisions made by students of similar caliber. And while the men did express more confidence in their math skills, self-confidence about technical skills did not correspond strongly to the career decisions of women.

Surprisingly, the men in the study were seemingly more daunted than the women by the prospect of balancing family commitments with careers. “The women who voiced stronger intention to have families were more likely to stay [in engineering], and the men who voiced stronger intention to have families were more likely to leave,” Silbey says. “We do not have an explanation yet for that, but it’s a fact that needs to be explored.”

The critical factor shaping the decisions of women, however, was their perception of the engineering workplace. Some women in the study arrived at this view through bad experiences in engineering internships. As one student at the University of Massachusetts told the researchers: “The people whom I work with don’t take me seriously. Not everyone does this, but a fair amount of the older men in my working environment do this. They’ll treat me like I know nothing and I’m only working … because my dad works there. What they don’t know is that I have a 3.7 GPA and am practically acing all of my engineering classes.”

As a result, the paper notes, many women find it difficult to “bear the burden of proving to others that, despite gendered expectations, they are skilled engineers,” and seek other professional disciplines.

Silbey’s co-authors on the paper are Erin Cech, a postdoctoral scholar at Stanford University and the lead author of the paper; Brian Rubineau ’93, PhD ’07, a professor of organizational behavior at Cornell University; and Caroll Seron, a sociologist at the University of California at Irvine. The study became part of the independent doctoral research of Cech and Rubineau. It was conducted with the assistance of the MIT and Cornell survey units, and supported by the National Science Foundation.

Series of studies underway

To be sure, problems of gender integration in the workplace are hardly limited to engineering. However, as Silbey notes, many other white-collar professions that were historically male-dominated, such as law, have seen greater shifts in gender representation.

Silbey suggests that this particular contrast may have occurred because the legal profession more easily accommodates competing intellectual perspectives; or as she says, the law is “a basically pluralist, heterogeneous environment that is tolerant of variation,” at least in comparison to engineering. That characteristic may have made it easier for a critical mass of women to enter law in the 1960s and 1970s.

However, as the authors state in the paper, they “hope others will continue this research with larger samples and extend it to other professions,” to draw a more complete picture of how “professional role confidence” affects career choices.

Advocates for women in the engineering workplace say the study sheds light on a phenomenon that will require continued analysis. “We owe a debt to the authors for their research into a little-understood persistence from credential acquisition to career practice,” says Betty Shanahan, executive director and CEO of the Society of Women Engineers, a nonprofit group in Chicago. Shanahan adds, “Their insight and hopefully subsequent research can provide academic institutions, professional societies, and the engineering profession with interventions to increase the persistence of qualified women — and men — in engineering careers.”

Silbey and the co-authors of this paper are themselves engaged in multiple follow-up projects — partly using additional data from their survey of undergraduates — to pursue these and other questions. One paper they are working on directly compares the legal and engineering professions; another looks at relative salary differences in engineering and other fields.

Quantum Computer Built Inside a Diamond




Scientists have built a quantum computer in a diamond, the first of its kind. The chip in the image measures 3mm x 3mm, while the diamond in the center is 1mm x 1mm. (Credit: Courtesy of Delft University of Technology and UC Santa Barbara)

The demonstration shows the viability of solid-state quantum computers, which — unlike earlier gas- and liquid-state systems — may represent the future of quantum computing because they can be easily scaled up in size. Current quantum computers are typically very small and — though impressive — cannot yet compete with the speed of larger, traditional computers.

The multinational team included USC Professor Daniel Lidar and USC postdoctoral researcher Zhihui Wang, as well as researchers from the Delft University of Technology in the Netherlands, Iowa State University and the University of California, Santa Barbara. Their findings will be published on April 5 in Nature.

The team’s diamond quantum computer system featured two quantum bits (called “qubits”), made of subatomic particles.

As opposed to traditional computer bits, which can encode distinctly either a one or a zero, qubits can encode a one and a zero at the same time. This property, called superposition, along with the ability of quantum states to “tunnel” through energy barriers, will some day allow quantum computers to perform optimization calculations much faster than traditional computers.

Like all diamonds, the diamond used by the researchers has impurities — things other than carbon. The more impurities in a diamond, the less attractive it is as a piece of jewelry, because it makes the crystal appear cloudy.

The team, however, utilized the impurities themselves.

A rogue nitrogen nucleus became the first qubit. In a second flaw sat an electron, which became the second qubit. (More accurately, the “spin” of each of these subatomic particles served as the qubit.)

Electrons are smaller than nuclei and perform computations much more quickly, but they also fall victim more quickly to “decoherence” — the loss of their quantum state to environmental noise. A qubit based on a nucleus, which is large, is much more stable but slower.

“A nucleus has a long decoherence time — in the milliseconds. You can think of it as very sluggish,” said Lidar, who holds a joint appointment with the USC Viterbi School of Engineering and the USC Dornsife College of Letters, Arts and Sciences.

Though solid-state computing systems have existed before, this was the first to incorporate decoherence protection — using microwave pulses to continually switch the direction of the electron spin rotation.

“It’s a little like time travel,” Lidar said, because switching the direction of rotation time-reverses the inconsistencies in motion as the qubits move back to their original position.
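A toy numerical sketch of that intuition — a single simulated spin with a made-up phase error, not the team’s actual pulse sequence — shows how flipping the spin midway cancels the error:

```python
import numpy as np

def rz(theta):
    """Rotation of a single spin about the z-axis by angle theta."""
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

X = np.array([[0, 1], [1, 0]])          # the flip pulse
psi = np.array([1, 1]) / np.sqrt(2)     # spin pointing along +x
error = 0.7                             # unknown stray phase, radians

# Evolve with the error, flip, then evolve with the same error again:
echoed = rz(error) @ X @ rz(error) @ psi
print(np.allclose(echoed, psi))         # True: the error has been undone
```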

The team was able to demonstrate that their diamond-encased system does indeed operate in a quantum fashion by testing how closely its behavior matched the predictions of “Grover’s algorithm.”

The algorithm is not new — Lov Grover of Bell Labs invented it in 1996 — but it shows the promise of quantum computing.

The test is a search of an unsorted database, akin to being told to search for a name in a phone book when you’ve only been given the phone number.

Sometimes you’d miraculously find it on the first try; other times you might have to search through the entire book to find it. If you did the search countless times, on average you’d find the name you were looking for after searching through half of the phone book.

Mathematically, this can be expressed by saying you’d find the correct choice in roughly X/2 tries — if X is the number of total choices you have to search through. So, with four choices total, you’ll find the correct one after about two tries on average.

A quantum computer, using the properties of superposition, can find the correct choice much more quickly. The mathematics behind it are complicated, but in practical terms, a quantum computer searching through an unsorted list of four choices will find the correct choice on the first try, every time.
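
That four-item case is small enough to simulate directly. Here is a minimal numpy statevector sketch of Grover’s algorithm — an illustration of the math, not the team’s diamond hardware — showing a single iteration concentrating all probability on the marked item:

```python
import numpy as np

N, marked = 4, 2                  # four-entry "database"; item 2 is the target

state = np.full(N, 1 / np.sqrt(N))    # uniform superposition over all entries

oracle = np.eye(N)                    # oracle: flip the marked amplitude's sign
oracle[marked, marked] = -1

s = np.full((N, 1), 1 / np.sqrt(N))   # diffusion: reflect about the mean
diffusion = 2 * (s @ s.T) - np.eye(N)

state = diffusion @ (oracle @ state)  # one Grover iteration suffices for N = 4
print(np.round(state ** 2, 6))        # [0. 0. 1. 0.] -- certainty on item 2
```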

Though not perfect, the new computer picked the correct choice on the first try about 95 percent of the time — enough to demonstrate that it operates in a quantum fashion.
