Development by Davis

Headlines

Friday, March 30, 2012

Development by Davis: “A distributed social workforce drives profit and performance” plus 19 more


A distributed social workforce drives profit and performance

Posted: 30 Mar 2012 03:27 AM PDT

LiveOps is the only contact center (call center) leader focused on providing the full platform, applications, and talent in the cloud. Consumers increasingly expect on-demand information and constant contact with the companies they do business with. These expectations have driven the evolution of the contact center and fundamentally challenged its traditional hierarchical organization and branding paradigms.



Rattner’s Virtual Worlds Keynote: Research Reflections on IDF Day 3

Posted: 21 Sep 2007 04:09 AM PDT

Thursday, our CTO Justin Rattner gave a keynote on virtual worlds and the emergence of what he called the 3D Internet. The 3D Internet Rattner described is the mushrooming social world of multiplayer online games, of complex animations for medicine and science, of applications such as Google Earth, and of virtual worlds such as Second Life. With some 60 million people now participating in virtual worlds such as Second Life, Justin said that "clearly this is not a niche."

Here's Justin in both real and virtual form.


One example he discussed is virtual medicine. Justin showed a video from Dr. Court Cutting, a surgeon at the New York University Medical School, who uses computer modeling of facial structures to plan cleft palate repairs in young children. The computational demands of doing this, however, are very high.

Realizing a true 3D Internet places similarly steep demands on technology. Justin characterized the requirements as:

Servers: 10-100x more computational power than more traditional online games

Clients: 3x+ the CPU power and at least 20x the graphics processing power

Networks: at least 100x the bandwidth compared to more traditional applications

This is where we expect to be in the next 5-10 years, driven by tera-scale computing, silicon photonics, and related research. Virtual worlds are perhaps the most interesting example of the model-based applications that will be enabled by tera-scale computers. As an example, Justin brought researcher Daniel Pohl on stage to show the progress we've made toward enabling real-time ray tracing, which renders lighting much more realistically than today's rasterization techniques.

For more info, see this video, in which I had a chance to discuss some of what we're doing in these areas with a colleague last week:


Tera-scale Demos at IDF

Posted: 20 Sep 2007 02:45 AM PDT

Following up on Brian's post yesterday, here are some pics and info on the Tera-scale demos we have here at IDF.

80-Core Teraflops Research Processor


Here's Paolo and Nitin, part of the team showcasing the 80-Core Teraflops Research Processor. This represents an important milestone towards enabling future processors with 10s to 100s of cores. It is the first programmable chip to deliver more than one teraflops of performance, and it consumes very little power. This prototype focuses on exploring scalable, energy-efficient designs for future multi-core chips as well as core-to-core interconnect and clocking.

Multi-Core Hardware Emulator


Franz, Elmar and Thorsten are from our labs in Germany. Their emulator system enables fast, accurate emulation of future multi-core architectures. Because it is based on FPGA technology, future designs can be prototyped with short turnaround times. It has a rich set of debug features that supports analyzing platform performance against other designs. They showed an experimental eight-core Intel Architecture CPU running a real OS on the emulation system.

Safer Software Execution through Log-Based Architectures


Lifeguards, software tools that proactively monitor programs for potential problems, can improve the reliability of end-user systems by catching software errors at runtime. Intel and Carnegie Mellon University are collaboratively exploring Log-Based Architectures, new hardware enhancements designed to improve lifeguard performance on multi-core processors. Michael and Shimin demonstrated an example of an enhanced lifeguard detecting a computer virus attack.

Taking Applications Tera-Scale with Ct


Ct, an advanced data-parallel programming environment, extends C for throughput computing to maximize the programmability and performance of applications on present and future multi-core platforms. Jane and Mohan compared conventional parallel programming with Ct, specifically for image processing and game physics applications. A sketch of the conventional approach follows.
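
For context, here is a minimal sketch of the kind of manually parallelized C code that data-parallel environments like Ct aim to replace. The OpenMP pragma and the brighten() example are illustrative assumptions, not Ct syntax or the demo's code:

    /* Conventional parallel C: the programmer annotates the loop and must
       reason about data races. Ct-style environments instead express such
       per-pixel work as deterministic whole-array operations. */
    #include <omp.h>

    void brighten(float *img, int n, float gain) {
        #pragma omp parallel for       /* programmer manages the parallelism */
        for (int i = 0; i < n; i++) {
            img[i] *= gain;            /* independent per-pixel work */
        }
    }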

Ray Traced Gaming


The ultimate goal for computer-generated graphics is to create photorealistic imagery generated on the fly. Ray tracing models the behavior of light to create shadows and reflections much better and more easily than the techniques used to render interactive 3-D graphics today. Daniel showed that the time is nearing when tera-scale computing will finally make real-time ray tracing possible.
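
To make the technique concrete, here is a minimal sketch of the ray-sphere intersection test at the heart of any ray tracer. It assumes a normalized ray direction and is illustrative only, not the demo's code:

    /* Ray-sphere intersection: the core visibility test a ray tracer
       performs millions of times per frame. */
    #include <math.h>

    typedef struct { double x, y, z; } Vec3;

    static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
    static Vec3 sub(Vec3 a, Vec3 b) { Vec3 r = {a.x-b.x, a.y-b.y, a.z-b.z}; return r; }

    /* Returns the distance along the ray to the nearest hit, or -1.0 on a miss.
       dir is assumed normalized, so the quadratic's leading coefficient is 1. */
    double intersect_sphere(Vec3 origin, Vec3 dir, Vec3 center, double radius) {
        Vec3 oc = sub(origin, center);
        double b = 2.0 * dot(oc, dir);
        double c = dot(oc, oc) - radius * radius;
        double disc = b * b - 4.0 * c;
        if (disc < 0.0) return -1.0;          /* ray misses the sphere */
        double t = (-b - sqrt(disc)) / 2.0;   /* nearer of the two roots */
        return (t > 0.0) ? t : -1.0;
    }

Shadows and reflections then fall out naturally: from each hit point, trace a secondary ray toward each light or along the mirrored direction, which is exactly the physical behavior that rasterization has to approximate.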


Research Reflections on IDF – Day 2

Posted: 19 Sep 2007 09:15 AM PDT

Here at IDF – Day 2, the technology showcase is going full steam. With lunch being served in the showcase area it is a certain draw for the attendees – kind of like a massive lunch and learn. In the … Continue reading


Create Photorealistic 3D Models using Embree

Posted: 08 Aug 2011 09:20 AM PDT

Imagine being able to create a photorealistic 3D model of any object on the fly. Pre-rendered 3D models are already having a huge impact on the film and gaming industries, and engineers like Manfred Ernst at Intel's Visual Computing Institute are using Embree's highly optimized ray tracing kernels to create, in only a few seconds, interactive models that are indistinguishable from photographs.

Come hear Manfred Ernst discuss Monte Carlo ray tracing kernels as well as an overview of the sample renderer at SIGGRAPH.

Room 216, 10:45-11:45 AM, Wednesday, August 10th

Visit http://www.intel.com/software/siggraph for more information and to view the presentation slides.

Intel's 2nd Generation Intel® Core™ processors are great at handling ray tracing. To get a good look at the images Embree can create, go to http://software.intel.com/en-us/articles/embree-photo-realistic-ray-tracing-kernels/. Download the source code from that page, and soon you could be rendering realistic 3D images of your company's products, displaying a walk-through of an architectural model, or creating jaw-dropping images for your films, games, or advertisements.

Find out more about Intel Labs' Visual Computing Institute.

Room 216, 9:00-10:00 AM, Wednesday, August 10th

Swing by Intel Labs' Visual Computing Booth #329 and you could be chosen to win a free Intel SSD!


Intel invites you to submit a proposal for an Intel Science and Technology Center

Posted: 03 Aug 2011 12:50 AM PDT

Intel is a major sponsor of academic research all over the world. We support research in many academic fields, ranging from silicon technology to cloud applications. In most of our engagements with academia we contribute more than funds and equipment: we establish an open dialog and exchange of research challenges and ideas among Intel's leading technologists, faculty, and students.

My team, Academic Programs and Research, supports academic research through several different programs. Some programs are directed at solving a specific technology bottleneck (details from our Academic Research Office at http://techresearch.intel.com/ThemeDetails.aspx?id=10). Other research programs have a wider span and aim to advance the state of the art in research areas of growing importance to consumers and the IT industry. Recently, we identified several such areas: visual computing, secure computing, cloud computing, embedded computing, and pervasive computing. To accelerate innovation in these areas, we have launched four research centers in recent months, called Intel Science and Technology Centers (ISTCs). Each center is hosted at a leading US university and includes several additional universities.

Our vision is for each center to build the best research community to advance its selected research topic. This community will encompass a large group of faculty and students, thought leaders and researchers from Intel, and, likely, additional sponsors and participants from industry and government. We want this community to work together, to ignite each other's innovation, and to march toward common research goals. The mix of Intel researchers, who have hands-on experience with leading industrial technology and knowledge of the market, together with brilliant young students and faculty from several diverse university research groups, is very promising, and it is exactly what we are after.

Intel Science and Technology Centers operate under an open IP model: all research results will be published, and all significant software will be released as open source. We believe this openness is important for enabling cooperation and for achieving maximum progress in the chosen research areas, to the benefit of all consumers and Intel's customers.

On August 3, 2011, we announced an open call for abstracts, inviting faculty from US universities to submit an abstract of a proposal for a new ISTC. We are asking the academic community in the US to help us identify additional research areas that, on the one hand, have the potential to make an important impact on end users and businesses and, on the other, are on the verge of technology breakthroughs and game-changing capabilities.

Faculty and students involved with Intel Science and Technology Centers gain both additional funds to support their research and the opportunity to make a real-world impact with their inventions. The road from good ideas to end products is usually long and hard, and many excellent technologies never make their mark on the world because of it. I strongly believe that the ISTC model of collaboration between industry and academia will dramatically strengthen the pipeline of technologies from academia to industry and will benefit the US economy. I have already attended several workshops and events organized by the Secure Computing ISTC hosted by the University of California, Berkeley and the Visual Computing ISTC hosted by Stanford University. The energy level, the openness, the number of new ideas, and the number of new collaborations were heartwarming.

Do not delay: submit your abstract before 11:59 PM PDT on September 2, 2011. For more details, see http://intel.com/go/istc-abstract


Intel Announces Two New Intel Science and Technology Centers – Cloud and Embedded Computing

Posted: 02 Aug 2011 12:50 AM PDT

Imagine leaving the office carrying a briefcase full of work. As you enter your car, the updated family calendar is shown on the dashboard display. With a few free hours anticipated before you pick up the kids, you stop by the shopping mall along the route home. At the entrance to the mall, you pause in front of a digital sign that recognizes you and displays products of interest.

A virtual assistant engages you in a brief conversation and then directs you to the store(s) carrying the clothes and shoes you need. It also sends a 20%-discount coupon to your smartphone, another of the many bargains you have received since allowing the mall's networked-sensor system to gather information and learn about your interests.

Driving home, the vehicle taps into a vehicle-to-vehicle (V2V) sensor network, processed through the cloud, that provides up-to-date routing to avoid traffic and weather- or road-based hazards so you can pick up the kids and get home safely.

When you arrive home, you are greeted by the family's robot maid, Zia, who begins preparing a stir-fry dinner based on knowledge of what the family members had for lunch, what's currently in the refrigerator, and what ingredients and dinners they have enjoyed in the past. As the robot pulls ingredients from the fridge, the grocery shopping list is updated automatically. A side panel shows an up-to-date calendar and real-time information about the location of family members to align dinner time with their arrival. You check on the progress of dinner. Because Zia still can't cut shiitake mushrooms proficiently, you cut them yourself and explain the technique. Zia records multi-sensory input of your actions for later analysis and learning.

Fiction? Today, maybe. In the near future, not at all. Some of the technologies needed to support these scenarios are in a nascent stage, while others still need to be explored. How soon until reality? No one knows for sure, but one thing we do know: they all require a tremendous amount of computing resources and open the possibility of new markets and applications for our products. But how do we get there from where we are today? That brings me to Intel's announcement of two new Intel Science and Technology Centers (ISTCs), which will perform research and explore areas that provide a foundation for making these scenarios a reality someday. The two centers will be focused on cloud computing and embedded computing and will be co-located at CMU. This is part of Intel's strategy of funding high-impact research centers at universities; in fact, Intel has committed $100M in funding over the next five years to support this endeavor. With two ISTCs already announced (the Visual Computing Center located at Stanford and the Secure Computing Center located at UC Berkeley), we want to welcome their two sister centers to the fold.

Three unique features designed to increase the probability of successful collaboration are part of the fabric of each ISTC: (a) an open, collaborative research model that encourages all researchers under the ISTC umbrella to release their results into the public domain, creating an open IP model; (b) a multidisciplinary approach, meaning the complete platform (both hardware and software) is explored across multiple engineering disciplines, creating an integrated approach to research; and (c) the "hands-on" involvement of Intel. Each "hub" school will provide an academic principal investigator (PI) to work alongside a counterpart PI from Intel. Additionally, Intel Labs will provide up to three researchers co-located with the center, in addition to the principal investigator, to ensure close ("hands-on") collaboration with Intel, as well as creating a natural technology-transfer conduit when the ISTC ends and the embedded resident researchers are assimilated back into the labs on an Intel campus.

"Seeding" the Clouds

The cloud computing center focuses on enabling new paradigms to make the cloud computing of the future more efficient and effective. For instance, broadcasting, texting, and tweeting do not require the same amount of computing power as video compression and streaming. Yet today we provide the same amount of computing power to handle both, which makes homogeneous computing centers energy inefficient. We expect the amount of data handled by the cloud centers of the future only to grow. Large amounts of data require the exploration of Big Data analytics as we seek to efficiently process and stream varied data content. And as these data and processing centers grow, more automation will be required to facilitate IT support. With this in mind, the ISTC Cloud Computing center will have four main thrust areas (a minimal placement sketch follows the list):

Specialization: Contrary to the common practice of striving for homogeneous cloud deployments, clouds should embrace heterogeneity, purposely including mixes of different platforms specialized for different classes of applications. This pillar explores the use of specialization as a primary means for order of magnitude improvements in efficiency (e.g., energy), including new platform designs based on emerging technologies like non-volatile memory and specialized cores.

Automation: Automation is key to driving down the operational costs (human administration, downtime induced losses, and energy usage) of cloud computing. The scale, diversity, and unpredictability of cloud workloads increase both the need for, and the challenge of, automation. This pillar addresses cloud's particular automation challenges, focusing on order of magnitude efficiency gains from smart resource allocation/scheduling (including automated selection among specialized platforms) and greatly improved problem diagnosis capabilities.

Big Data: Cloud activities of the future will be dominated by analytics over large and growing data corpuses. This pillar addresses the critical need for cloud computing to extend beyond traditional big data usage (primarily search) to efficiently and effectively support Big Data analytics, including the continuous ingest, integration, and exploitation of live data feeds (e.g., video or Twitter).

To the Edge: Future cloud computing will extend beyond centralized (back-end) resources by encompassing billions of clients and edge devices. The sensors, actuators, and "context" provided by such devices will be among the most valuable content/resources in the cloud. This pillar explores new frameworks for edge/cloud cooperation that (i) can efficiently and effectively exploit this "physical world" content in the cloud, and (ii) enable cloud-assisted client computations, i.e., applications whose execution spans client devices, edge-local cloud resources, and core cloud resources.
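
As a hedged illustration of the Specialization pillar, here is a toy placement function that routes workload classes to the platform type expected to minimize energy. The workload classes, platform names, and mapping are invented for illustration, not the center's design:

    /* Toy heterogeneity-aware placement: match each workload class to a
       specialized platform instead of a one-size-fits-all server. All
       names and the mapping itself are hypothetical. */
    typedef enum { WL_MESSAGING, WL_VIDEO_STREAM, WL_ANALYTICS } Workload;
    typedef enum { PF_LOW_POWER_CORES, PF_ACCELERATOR, PF_BIG_CORES } Platform;

    Platform place(Workload w) {
        switch (w) {
        case WL_MESSAGING:    return PF_LOW_POWER_CORES; /* light, latency-tolerant */
        case WL_VIDEO_STREAM: return PF_ACCELERATOR;     /* compression-heavy */
        case WL_ANALYTICS:    return PF_BIG_CORES;       /* compute- and memory-hungry */
        }
        return PF_BIG_CORES;  /* conservative default */
    }

The Automation pillar's resource allocator would sit above a function like this, choosing among specialized pools as load shifts.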

The center brings together top academic minds from CMU and three other top-tier US schools (UC Berkeley, Georgia Tech, and Princeton). The academic PI is Professor Greg Ganger (CMU), while his counterpart from Intel is Principal Research Scientist Phil Gibbons. Along with them, there will be 21 academic researchers and 3 Intel embedded researchers.

Embedded Computing

The ISTC-EC brings together thought leaders from seven universities (CMU, Georgia Tech, UC Berkeley, University of Illinois at Urbana-Champaign, Penn State, University of Pennsylvania, and Cornell) to drive research and transform experiences in the retail, automotive, and home environments of the future. The popularity of real-time, intelligent, personalized technology is growing, and the demand for specialized embedded computing systems will correspondingly grow to support a broad range of new applications, many still to be envisioned. The ISTC Embedded Computing Center will have four main thrust areas:

Collaborative Perception: Perception in embedded applications has unique challenges as it must be performed online and in real-time in the face of limited power, memory and computational resources. The Collaborative Perception theme seeks to explore new ways to do this robustly.

Real-time Knowledge Discovery: Machine learning in embedded applications carries with it a host of unique challenges: low-power environments, multiple specialized sensing modalities, complex tradeoffs between pushing computation to the cloud or first processing data locally, and efficiently incorporating vast quantities of local/external data into local computations, etc.

Robotics: Robotic toys and vacuum cleaners are starting to inhabit our living spaces, and robotic vehicles have raced across the desert. These successes appear to foreshadow an explosion of robotic applications in our daily lives. However, without advances in robot manipulation and navigation in human environments, many promising applications will not be possible. We are interested in robots that will someday work alongside humans, at home or in the workplace.

Embedded System Architecture: The Embedded System theme aims to realize large-scale algorithms such as real-time learning and collaborative perception efficiently, given the unique power, memory and computational resource constraints of embedded systems, the particular context (physical location, proximity), as well as the domain-specific requirements.

The center will have two PIs, Priya Narasimhan (associate professor at CMU) and Mei Chen (Intel Labs Research Scientist), driving research and collaboration across the various institutions. Along with them, there will be ten leading researchers from the universities listed above along with 3 Intel embedded researchers and 2 additional embedded researchers from ECG.

The future is bright. Let's keep moving forward.


Can’t wait to create photo-realistic images for free?

Posted: 11 Jul 2011 04:44 AM PDT

After witnessing the visual computing research demos at Research at Intel Day 2011, I am excited to envision the consumer shopping experience of the near future. Actually, the "near" future is going to be nearer than I thought, because Intel Labs announced that it will release "Embree: Photo-Realistic Ray Tracing Kernels" as open source. This will enable people (yes, literally anyone) to try out the code and use it for free if they like it.


Embree is a progressive photorealistic rendering system that turns 3D models into images that are virtually indistinguishable from photographs. Viewing 3D models as images is part and parcel of daily consumer life; online shopping, movie production, and architectural visualization are very good examples where realistic rendering of 3D models is important. Turning 3D models into pictures can be done in three ways: non-interactive, real-time, and progressive. Briefly, the differences between these methods are:

- Non-interactive: Images are precomputed and stored for later viewing. (Example: movie production)

- Real-time: Used in dynamic interactive environments where it is impossible to predict the image to render. The 3D model must be converted to images on the fly, and latency cannot be tolerated in such environments. (Example: games)

- Progressive: An interactive scene can be converted to an image, but slight latency (within seconds) can be tolerated. A final image that is virtually indistinguishable from a photograph is rendered in a few seconds. This method can be viewed as intermediate between offline rendering (which takes hours or sometimes days) and real-time rendering. (A minimal accumulation sketch follows this list.)
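
To make "progressive" concrete, here is a minimal sketch of the accumulation loop such renderers use: each pass produces a noisy image, and a running average converges toward the final picture over seconds. The function name and framebuffer layout are illustrative assumptions, not Embree's API:

    /* Fold one noisy rendering pass into the running average.
       pass_count is 1 for the first pass, 2 for the second, and so on. */
    void accumulate_pass(float *accum, const float *pass, int n, int pass_count) {
        for (int i = 0; i < n; i++) {
            /* incremental mean: the image sharpens as passes accumulate */
            accum[i] += (pass[i] - accum[i]) / (float)pass_count;
        }
    }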

Professionals such as movie makers, architects, and car companies currently use non-interactive methods to create photo-realistic images for consumers. Software developers can use the Embree photo-realistic ray tracing kernels to improve the performance of their rendering applications by as much as 2x, accelerating the transition from non-interactive to progressive rendering.

This transition will enable completely new applications and user experiences. Imagine being able to walk through a newly designed building online before it is built, getting photo-realistic pictures from arbitrary viewpoints within seconds, or seeing an accurate 3D model of your new car's headlights before you order it.

Read about Embree, see a demo, and download the source at: http://software.intel.com/en-us/articles/embree-photo-realistic-ray-tracing-kernels/. I can't wait to hear your experiences.


Not getting enough interactive experience in your virtual world? Check this out…

Posted: 11 Jul 2011 04:23 AM PDT

Virtual environments, the way they are designed today, have limits on the number of users they can support at a time and do not provide immersive, rich interaction. Today's virtual environments are being pushed to a point where they can no longer be contained on a single server.

To solve the problem, Intel has created a new way to construct virtual environments.

Basically, they break down all of the important jobs that a server has to do in order to run a virtual world. For example, all of the client connections can run on one group of servers, while scripted object behaviors and physical simulation (e.g. gravity, motion, and collisions) each run somewhere else. The users then interact on what's called a "Distributed Scene Graph," the intersection of all of these various constituent parts. This composition can scale to the needs of the application or event: if you need more detailed surroundings, you won't have to sacrifice script complexity or the number of interacting elements.

Similarly, you could populate a space with thousands of users without crashing the whole world. By separating the individual functions that compose a virtual world, each job can be allocated the proper amount of resources without taking away from the other functions.
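
Here is a hedged sketch of that separation of concerns: each simulation function gets its own server pool that can be scaled independently. The type names, hosts, and capacities are illustrative assumptions, not the OpenSim or Distributed Scene Graph API:

    /* Each virtual-world concern runs on its own independently scaled pool;
       the scene graph is the point where they intersect. */
    typedef enum { SVC_CLIENT_MGMT, SVC_PHYSICS, SVC_SCRIPTS } Service;

    typedef struct {
        Service     role;      /* which concern this node handles */
        const char *host;      /* hypothetical host name */
        int         capacity;  /* e.g. max connected avatars or simulated objects */
    } ServiceNode;

    /* A concert draws a crowd: add client-management hosts without
       touching the physics or script pools at all. */
    ServiceNode client_pool[] = {
        { SVC_CLIENT_MGMT, "clients-01", 2000 },
        { SVC_CLIENT_MGMT, "clients-02", 2000 },  /* added under load */
    };
    ServiceNode physics_pool[] = {
        { SVC_PHYSICS, "physics-01", 50000 },
    };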

So, if you want to throw a concert, you could set up the stage and the performers on one service provider so that the immense throngs of adoring fans won't interfere with the show. Alternatively, scientists could collaborate on complex operations using intricate models. Then, these models could be used to educate hundreds of thousands of college students in a virtual lecture hall or a programmed archeological dig.

Maybe the fact that Intel can pull off increasingly complex environments with only a fraction of the processing power for the scene doesn't impress you. If not, perhaps the fact that they are open-sourcing this new technology will. To read more about the Distributed Scene Graph, see a demo, or download the code for free, visit http://software.intel.com/en-us/articles/scalable-virtual-environments/


Research within Intel’s Academic Centers focused on improving visual experience

Posted: 24 Jun 2011 06:03 AM PDT

Here are my thoughts on Research within Intel's Academic and Research Centers.

At this booth, I got an overview of the visual computing research centers and their focus areas, with some cool demos. These centers create a community of researchers across the US and Europe to lay the research groundwork for next-generation visual computing applications. The two visual computing centers are:

- The Intel Visual Computing Institute at Saarland University (with leaders from across Europe) is focused on the 3D Internet, bridging real and virtual worlds, and scalable rendering.

- The Intel Science and Technology Center for Visual Computing (ISTC-VC) consists of leaders in visual computing across the US and is focused on content creation, real-time scalable simulations, and perceiving people and places.

Across all these focus areas, the centers are also looking at the underlying hardware platforms that will allow these applications to run effectively in the next few years.

These two centers are building a community that, in combination, can drive visual computing technology development in many ways to improve the user experience. The latest and greatest innovations will flow into Intel, go through revisions, and finally find their way into products to provide the experiences users really want!



Did you miss the livecast of Research at Intel day 2011?

Posted: 24 Jun 2011 03:25 AM PDT

Don't worry! Here are the YouTube videos:

Visual Computing Zone:

Personal Energy Zone:

Cloud Zone:


Intel in Automotive: Intel Labs, Integrated Platform Research: Car, Cloud and Phone

Posted: 13 Jun 2011 09:06 AM PDT

I'm thrilled to be attending the Research@Intel event, where we are showcasing our automotive research. Recently, automakers have been adding more technology to their cars to deliver better experiences for their customers and to begin to extend connected consumers' experiences into their cars.

Most new cars now have support for digital media, Bluetooth, and USB device connectivity; some even have internet and WiFi on board. However, having just sampled the currently shipping systems from some of the world's leading car manufacturers, I can say firsthand that while the feature lists and capabilities are extensive and impressive, there is room to dramatically enhance and innovate on the experiences they deliver. You can, in fact, connect your phone, play digital media, and access some connected services, all after spending a significant amount of time connecting devices, setting up accounts and passwords, and finding controllers and buttons. Unfortunately, getting the most from these systems requires significant learning time and reading of user guides. One vehicle we benchmarked had three system displays, four rotary controllers, and a touch-pad that must be used in conjunction with a multi-button rotary controller. Yikes!

While I applaud the rapid integration of technology into vehicles, there is a lot of room to innovate on how these technologies are designed and integrated for the vehicle environment. The magic required is system-level integration for the user's tasks at hand: user design with a purpose. What tasks are users trying to accomplish most often while simultaneously using the vehicle? How can these be made compelling experiences and not more work for people?

Intel Labs' Integrated Platform Research is demonstrating how to combine the computing and communications capabilities embedded in the car with those of smartphones and vehicle-focused cloud services, bringing them together in a simple, compelling way. The Car, Cloud and Phone experience at Research@Intel securely connects these components with one touch. It leverages the deep processing and service capabilities of the cloud for customized vehicle remote control and video surveillance. With this technology in your new car, you could quickly and easily pair your smartphone with it, even before you drove it home from the dealer. Your phone would automatically be configured specifically for your car, putting its unique features and controls at your fingertips even when you are away from the vehicle. Our app looks familiar, reproducing the look and feel of your car's interior: key fob buttons, displays, even the look of the dash.

You can be away from your car but stay in touch with it. Inside a restaurant ordering dinner? What happens if your car is bumped? You will instantly be notified on your phone, and you'll be able to view stored and live video from your car, directly on your phone. Someone's pet bumped into your car? No problem; enjoy your dinner. Another vehicle hit it? Now that's a problem, and you can take action. This is only one example of what you could do with your car, the cloud, and your phone working together. This technology is fully integrated with a production vehicle and its embedded, on-board systems. I can't wait to see people's reactions to these new in-car and remote vehicle experiences.


Research@Intel 2011

Posted: 07 Jun 2011 04:09 PM PDT

We've just finished up with the 9th annual Research@Intel press event, at which Intel Labs showcased examples of forward-looking research to media and industry analysts from around the globe. Intel CTO Justin Rattner kicked off the event by stressing the importance of collaboration between Intel, academia, governments, and the broad technology industry in order to turn the ideas of today into reality. He cited Intel's collaboration with Apple to develop Thunderbolt I/O technology, which had its genesis in Intel Labs, as well as collaborations with DARPA to develop extreme-scale computing technology 1000x more capable than what can be done today.

Justin also announced a new Intel Science and Technology Center (ISTC) for Secure Computing. This center will be co-led by Intel Labs and UC Berkeley and will research breakthroughs to make personal devices and data more secure. Berkeley will act as the hub of academic research for the center, which also includes researchers from University of Illinois, CMU, Drexel, and Duke. This is the second in a series of ISTCs to be announced this year – we announced the ISTC for Visual Computing, co-led by Stanford University, in January.

He also referenced the Many-core Applications Research Community, a network of more than 80 institutions testing next generation software ideas using the 48-core "Single Chip Cloud Computer," a concept chip developed at Intel Labs. The learning from these research efforts will help guide the development of future architectures which are even better suited to the needs of tomorrow's cloud datacenters.

The mood on the show floor today was exciting. I had the opportunity to give a livecast on the demos in the "cloud" zone, describing projects such as a cloud-based, ray traced game running on handheld devices and a virtual city designed for disaster response training. For the latter, the innovation shown was the ability to scale the number of 'players' in such online virtual training scenarios from hundreds to thousands. And, as announced today, this code will be made available via open source for use by the OpenSim virtual world community later this month.

We also announced today that Intel Labs will open-source code for a different ray tracing project that targets offline, photorealistic rendering for uses such as design and digital effects in film. This code can provide up to a 2x speed boost for such professional applications and will also be available later this month.

In other parts of the show floor we showed an array of interesting projects: a bike-powered netbook for developing economies, a "magic mirror" that allows you to see your virtual body and change it based on real human body scans, a programming system designed for the exa-scale supercomputers of the future, human perception technologies, and more. Many of these projects include academic, industry, and government collaborations of the type Justin emphasized in his opening address.

Looking around the event, I was amazed at the diversity of projects, the number of leading minds in the room, and the level of interest in the research. And, as part of the R@I planning team, I know that the ~35 demonstrations on the floor represent only a fraction of the projects at Intel Labs. From here, the future looks bright.


What is Intel doing to visualize futuristic applications? Watch it live!

Posted: 07 Jun 2011 03:58 AM PDT

Have you ever wondered how Intel Labs' internal and sponsored visual computing research will improve your daily life in the future? As a technical marketing engineer in Intel Labs, I have the privilege of witnessing Intel Labs' futuristic visual computing research demos at Research at Intel Day 2011 on June 7th and 8th. Here is one example. Imagine shopping online for almost everything you need in daily life, without guesswork. Today's online shoppers have to imagine the look and feel of the actual merchandise. What if you could use today's smart TVs to shop online while sitting on your living room couch, with a 3D view of each item and the ability to try it on using the tracking camera in your living room? Technologies unveiled at this year's Research at Intel Day are going to enable a plethora of such use cases.

Looking under the covers at the online touring demo, you can see that research at the Intel Visual Computing Institute into transparent 3D Internet technology will allow realistic, 3D-like viewing of online content without proprietary browser plugins. Intel's latest processors provide the hardware support for very rapid rendering of realistic views of objects in the browser. But Intel Labs didn't stop there: they extended the improved visual experience to digital content by coupling Intel's platforms with consumer electronics devices to provide a must-have user experience in the living room.

Intel realizes that it's not all about shopping. Next-generation virtual environments are impacting the gaming industry and becoming a proving ground for realistic test scenarios. Small businesses are now generating revenue using technologies based on virtual environments. What if we could add real-time technologies such as facial and emotion recognition to virtual applications? What if today's virtual world backends could support more than 20 times the avatars supported in a virtual scene today? This would let you interact with virtual objects in many new and interesting ways for a much more realistic, immersive virtual experience. A good example Intel Labs is showing at Research at Intel Day is a massive multiplayer "game" to train first responders for different disaster scenarios. As part of its strategy to increase collaboration across industry and academia, Intel Labs will release source code for its Distributed Scene Graph 3D Internet technology. This code is part of an ongoing effort to augment the OpenSim open-source virtual world simulator, and it will enable developers to build virtual regions where people can work or play online with a cast of thousands instead of being limited to less than a hundred today.

Finally, applications such as architectural visualization often require expensive proprietary software to create realistic computer models. Intel Labs will release open source software that third parties can use to enable photo-realistic rendering of 3D models into images that are indistinguishable from photographs. This advanced ray tracing code targets professional applications and is a separate effort from our game-focused real-time ray tracing project shown previously.

Cool stuff, huh? Since there are over 40 demos planned, I've described just a very small sample of what will be shown to the press at Research at Intel Day.

I hope this quick peek behind the scenes has been interesting to you. As I learn more leading up to Research at Intel Day and attend the event, I will keep my thoughts coming on my blog. So stay tuned, and don't forget to check back with me for the live stream of the event as I walk through the visual computing research demos. The live stream will start on June 7th, 2011 at 1 PM PST. Mark your calendars!


Wolfenstein gets ray traced – on your tablet!

Posted: 07 Jun 2011 03:00 AM PDT

Since the last entry on this Research Blog about the cloud-based Wolfenstein: Ray Traced demo, there have been several enhancements.

The previous setup, which required four separate machines each with one Knights Ferry card (Intel MIC) inside, has been changed to a configuration where the cloud is represented by a single server with four Knights Ferry cards. We showed this setup at CeBIT 2011.

This time, at Research@Intel Day 2011, we extended the setup to also work on Intel-powered tablets. We are demonstrating this on the Lenovo S10-3t (10 inch) and on the Viliv S5 UMPC (5 inch). Due to the lower screen resolution, a cloud setup with one machine with one Knights Ferry card per client is enough to feed the tablet at a frame rate of 20-30 fps.

Cloud-based gaming approaches could lead to a situation where it doesn't matter where you are or which device you have to play your favorite game.

Further detail about the implementation and the benefits of using cloud-based ray tracing for games can be found in the paper "Experimental Cloud-based Ray Tracing Using Intel® MIC Architecture for Highly Parallel Visual Processing".



New Intel Science and Technology Center for Secure Computing

Posted: 07 Jun 2011 02:55 AM PDT

Let me tell you why I am really excited about the launch of the Intel Science and Technology Center for Secure Computing. The reason: ISTC-SC is going to bat for end users, all of them, not just the ones who are technical or motivated enough to actively manage their privacy and security. Behavioral studies have shown that users prefer to focus on the benefits of their devices and apps, not on managing security.

The ISTC for Secure Computing is embarking on an ambitious program to develop technologies that seamlessly and automatically ensure the trust of the user in a variety of client and mobile devices and platforms. Simply stated, the research goal of the center can be summed up by the acronym SCRUB: Secure Computing Research for User Benefit.

The center brings together top academic minds from UC Berkeley and four other top-tier US schools (Carnegie Mellon, Duke, Drexel, and the University of Illinois) as well as four Intel researchers in a collaborative, open IP environment. The center is co-led by Prof. David Wagner (UC Berkeley) and John Manferdelli (Intel Senior Principal Engineer). These thought leaders, supported by bright young students, have organized the SCRUB research into five thrusts, each addressing a key component of end-user security. The thrust areas are:

- Secure clients using a "Thin Secure Intermediation Layer," providing partitioned and isolated software domains on a single device. This allows critical activities to run in a secure zone, while keeping "risky" activities in well-isolated, relaxed partitions.

- Secure mobile devices from third-party apps by developing tools to build and validate the safety of these apps, and provide a simple permissions system that enables users to safely use third-party apps across business and personal environments.

- Secure data storage, transmission, and use that guarantee the safety of user data regardless of where it is moved and used, while preserving user intent. Develop a data encapsulation and safe-use policy system.

- Secure network architectures enabling scalable, global monitoring of application semantics, while assuring data confidentiality for the individual user.

- Secure analytics by learning how to detect adversarial activity, and by developing models and tools for measuring, tracking, and analyzing the flow of information.

An important outcome of this effort will be bringing together a community of academics, in partnership with industry colleagues, to nurture the growth of critical thinking in the secure computing arena. This means publishing research results, growing the next generation of technical leaders by training students, and providing the basis for new business opportunities, products, and services in ways not yet imagined. Not only has Intel provided funding for the center, but it has significantly simplified research collaborations and amplified the potential impact of the center by encouraging an open IP model.

I believe that the ISTC-SC research thrusts together address the key components of security and trust concerns on end-user clients and mobile devices. I am looking forward to the day when I can install an app, manage financial records, and communicate with friends and business colleagues, from my home client devices to my smartphone, all safely. Yes, I am one of many users in a digital future who stand to benefit from the research of the new ISTC for Secure Computing.


Intel technologist ACEs tech award

Posted: 04 May 2011 01:00 AM PDT

I'm writing today to congratulate Intel Fellow Dr. Mario Paniccia on receiving EE Times' "ACE" award for Innovator of the Year last night in San Jose. These awards celebrate technologists who demonstrate leadership and innovation to change the electronics industry and the world. He won for leading the research team that developed the 50 Gbps Silicon Photonics Link, a concept fiber-optic connection designed to validate Mario's vision to 'siliconize' photonics.

This device represents tremendous potential to bring fiber optics to the mainstream. For personal devices, it could become practical to have connections that back up an entire hard drive or transfer an entire music library between devices in seconds. A single cable could carry video to a wall-sized HD display with a resolution equal to an array of ten or more 1080p screens; as a rough check, an uncompressed 1080p stream at 60 frames per second and 24 bits per pixel needs about 3 Gbps, so ten such streams fit within a 50 Gbps link. For the enterprise, pervasive fiber optics could eliminate traditional design constraints due to the distance and bandwidth of data cables. Entirely new architectures could be created that rearrange CPUs, memory, and other devices in a much more scalable fashion.


Though this is his first ACE award, it is actually the second time Mario has been nominated. The first was in 2006, following the development of the first 1 GHz silicon modulator and silicon-based laser. Having worked closely with Mario for many years, I'd like to share some of the history that led to this new achievement.

Before the turn of the millennium, silicon photonics remained an unlikely candidate for fiber optics. Silicon lacked many of the capabilities required for optical communication. Intel's own Silicon Photonics research has its roots in a debug tool developed by Mario in the late 1990s called the Laser Voltage Probe. The "LVP" became a standard tool to allow debug engineers to measure transistor signals on microprocessors by probing through the back side of the silicon with an infrared laser beam.

About a year or so later, Mario pitched an idea to Intel senior management for a silicon-based telecom optical switch, based on the same effect exploited by the LVP. After a few years of research it became apparent that the best use of this switching technology would be to create a fast modulator (an optical data encoder), which did not exist yet in silicon above a mere 20 MHz. The initial goal was ~1 GHz. Amid much skepticism, the team not only met the goal, they exceeded it. Today, silicon modulators at Intel Labs can send data at rates of 40 Gbps.

Mario's vision included three phases of research and development: (I) prove feasibility with demonstrations of optical building blocks, then (II) move to integration, and finally (III) achieve high-volume manufacturability. For the middle part of the past decade, his team focused on phase I, developing a variety of fundamental building blocks such as photodetectors, lasers, and modulators, and then scaling their performance to higher speeds.

Soon after, the team shifted focus from the development of building blocks to phase II: integration. The 50 Gbps silicon photonics link is the result of that effort, and required Mario's team to put devices with different process recipes onto the same piece of silicon, assemble them on to circuit boards, and connect them to optical fibers to create an end-to-end link. Read more about the device here.

Using silicon allows one to benefit from the decades of high volume manufacturing infrastructure developed for silicon integrated circuits such as microprocessors. We've seen what silicon integration can do for electronics, and I believe we will get similar benefits from silicon photonics.

Mario is now leading the team into phase III: tackling the remaining challenges that could stand in the way of high-volume production of these devices. This includes researching how to make reliable integrated devices economically at large scale, as well as continuing to increase levels of integration, performance, and scalability. 50 Gbps is just the beginning; Mario expects to see 1 terabit/s coming out of a single silicon chip in the near future.

So, again, I congratulate Mario and his team for this well-earned award. He has helped to take silicon photonics from a technology that few believed in, to one we can now see approaching on the horizon for a variety of applications.


Future Lab: Intel Science Talent Search 2011

Posted: 28 Mar 2011 08:41 AM PDT

Where do the inventors and researchers of the future come from? They are in high school right now! This year's Intel Science Talent Search brought together kids from across the country who are doing research that will knock your socks off! We spoke with three finalists who are investigating topics such as the formation of protein shapes in the human body, the long-term impact of environmental cleanup using organoclay, and alternative methods for approximating the square root of integers.

It is inspiring to see the enthusiasm and amazing work that these kids are bringing to their various fields of study.


Greener IT, one core at a time…

Posted: 16 Mar 2011 01:00 AM PDT

I'm happy to share today that the government of Germany has presented Intel with an award for the development of our 48-core concept vehicle, the Single Chip Cloud Computer. The research chip won a German Innovation Prize for Climate and the Environment in the category "Environmentally Friendly Technologies." The German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety and the Federation of German Industries presents these awards each year to acknowledge innovations that protect the climate and the environment. Franz Olbrich, from the lab in Germany which co-led the design of the SCC, traveled to Berlin to accept the award today on behalf of the company.

Developing 'green IT' is an essential focus for Intel researchers, and the fact that Intel is one of the largest purchasers of renewable energy shows how important resource conservation and sustainability are to the company's operations. Information technology now causes carbon emissions comparable to those of civil aviation. At first, a many-core processor might seem like an unlikely candidate to address these issues. Doesn't more computation mean more energy spent?

The answer is: not if you spend that energy wisely. The SCC, along with our entire Tera-scale computing research program, exists as a part of a larger effort to do more with less, computationally. This is because breaking down processing tasks into more and more parallel elements, running on streamlined cores, is a more energy-efficient approach than making chips run at faster clock speeds. That is, provided that this parallel computing can be done effectively. The SCC prototypes a variety of research techniques to make this division of work more efficient so that the energy benefits of many-core can be more fully realized.

In addition to raw parallelism, the SCC incorporates fine-grain power management features that allow software applications to determine how much energy is needed at a given time and for a given task. Clock frequencies and voltages can be set to different levels across the chip depending on application needs. Cores or even entire banks of cores can sleep and wake as needed. The 48 cores require just 25 watts in idle mode or 125 watts when running at maximum performance, which is comparable to the consumption of two standard household light bulbs. Imagine applying this kind of capability across large systems such as data centers, and you can see how the power savings would multiply.
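
As a hedged illustration of that fine-grain control, here is a sketch of how software might drive per-island frequency and voltage on such a chip. The function names and values are hypothetical stand-ins, not the actual SCC programming interface:

    #include <stdio.h>

    /* Hypothetical per-island DVFS control in the SCC style: software
       matches power draw to the work actually being done. */
    enum { IDLE_MHZ = 100, BURST_MHZ = 800 };

    static void set_island_freq(int island, int mhz) {
        printf("island %d -> %d MHz\n", island, mhz);   /* stub for hardware call */
    }
    static void set_island_volt(int island, int mv) {
        printf("island %d -> %d mV\n", island, mv);     /* stub for hardware call */
    }

    void adapt_power(int island, double utilization) {
        if (utilization < 0.10) {
            set_island_freq(island, IDLE_MHZ);  /* slow down first... */
            set_island_volt(island, 700);       /* ...then drop voltage safely */
        } else {
            set_island_volt(island, 1100);      /* raise voltage before... */
            set_island_freq(island, BURST_MHZ); /* ...running faster */
        }
    }

Because dynamic power scales roughly with frequency and with the square of voltage, dropping both on idle islands is where most of the savings come from.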

The SCC has also been shared with worldwide research partners through Intel's recently launched Many-core Applications Research Community (MARC), a program aimed at spurring innovations in highly parallel software. More than 100 teams are conducting research on programming models, operating systems, development tools and programming languages for both microprocessors and data centers of the future.

As a final note, in accepting the award Franz also announced that Intel plans to match the 25,000 Euro prize and donate it to a scholarship program that sponsors talented, high-achieving students in Germany.


Future Lab: Cloud Gaming

Posted: 01 Mar 2011 03:02 AM PST

Utilizing the resources of cloud-based gaming opens up the possibility of even more advanced graphics than could be done on a single machine today. Researchers at Intel are working with university collaborators and game developers to explore real-time ray tracing to show new special effects for games, giving notebooks, netbooks, and tablets access to high-end games. Find out more in this episode of Future Lab.

