Thank you to all who participated in and attended our 2005 - 2006 CSS Speaker Series. It was a great pleasure to see so many students, alumni, and community members. We will be back at the end of summer with a whole new lineup of guest speakers for our Special 10th Anniversary CSS Speaker Series for the 2006 - 2007 academic year. Please also look for information on our 10th Anniversary Celebration on October 21st at North Creek Cafe. We will be honoring our founding faculty and first graduating class. Invitations and announcements will be posted soon!
CSS Speaker Series Coordinator
Ten Myths of Rapid Development
Steve McConnell, Software Developer
The software industry commonly confuses high-energy motion with rapid, meaningful progress. Many projects are developed under intense schedule pressure and are still delivered late. This talk explores ten myths of rapid development, digs into the core issues of achieving short schedules, and explains how to lay the groundwork for truly effective software improvement. The lecture will be based on Steve McConnell's best-selling books, "Rapid Development" and "Professional Software Development".
The Crayfish & The Computer
Dr. Michael Stiber, Associate Professor, CSS
Five years past the end of the "Decade of the Brain", our ability to gather information about nervous systems is truly amazing. We can now conduct experiments that generate petabytes of imaging data from a human brain as a person thinks. We can record from molecule-sized ion channels in nerve cell membranes. We can modify the genetic code of test animals to custom-design their nervous systems for experimental purposes. And yet, for all of our advances in methods and knowledge about the brain, basic aspects of nervous system function still lie beyond our understanding.
In this talk, Prof. Stiber will discuss nervous systems "in the small": single neurons and networks of relatively few cells. While incredibly simple compared to a human brain, these small systems are the building blocks of all neural computation. They exhibit an immense diversity of behaviors, behaviors that could imply computational capabilities far in excess of what we usually credit them with. These capabilities include switching between linear and nonlinear operating modes, dynamically changing operating characteristics over time scales ranging from milliseconds to years, altering molecular or physical structure based on information encoded in the genome, responding to centrally broadcast commands, and even error correction. If the simplest components of nervous systems have such complex capabilities, what does that say about the computational power of the human brain?
Trends in Computer Graphics: The End of an Era for the GPU?
Dr. Peter Shirley, University of Utah
The major success story in special-purpose hardware is the graphics chip (GPU). For some niche applications, such as large-scale data visualization, parallel software ray tracers are already much faster than the most optimized GPU implementations; however, GPU programs remain the only viable choice for most interactive applications. There are three clear possibilities for the future of graphics on the desktop: a continuation of z-buffer based GPUs, the emergence of interactive ray tracing running on multicore CPUs, or ray tracing on custom hardware (ASICs). This talk examines trends in hardware and application data, argues that ray tracing on custom hardware is the likely winner, and outlines the research problems that must be solved for that outcome to be realized.
Security Analytics: Combating Smart Cyber Criminals
Ross Ortega, Ph.D., President, GraniteEdge Networks
Organizations house their most valuable data on computer networks, providing an enormous financial incentive for cyber crime. The bad guys are getting smarter and more dangerous, with organized crime and terrorist groups becoming major players in Internet-based attacks. They understand existing security solutions and therefore know how to evade traditional defenses. The best computer crimes are those that are never discovered, so malicious attackers strive to remain unseen, causing ongoing and nearly perpetual damage. All of this is driving a rapid evolution in cyber-security technologies, fueled by the billions of dollars that companies and governments of all sizes are investing in stronger defenses to protect their network infrastructure.
The new frontier of security defenses focuses on exposing zero-day, or previously unknown, attacks. These solutions are built upon complex algorithms, statistical analysis, and packet inspection, and they require vast amounts of storage and processing capacity. Evolving security concepts such as behavior analysis, causality analytics, fingerprinting, and forensics all rely upon cutting-edge math and computer science. This talk covers some of the weaknesses of existing solutions and presents the unique approach of causality analysis as applied to network security analytics.
Ross Ortega, President and Co-Founder of GraniteEdge Networks:
Dr. Ortega has demonstrated a proficiency in identifying markets for sophisticated technologies. As a co-founder of Consystant Design Technologies, he served initially as chief technology officer and later as president and CEO through the purchase of the company's intellectual property by Intel. There, he was instrumental in raising $12.5M and establishing key relationships with Intel and Matsushita. Dr. Ortega received his Ph.D. in Computer Science from the University of Washington and a BSEE from MIT. Previously, he was an Acting Assistant Professor in the Computing and Software Systems Department at the University of Washington, Bothell.
A Theory of Variation in Software Development, Architecture and Project Management
David Anderson, Agile Management --Keynote Lecture!
Traditional software engineering methods are built on the assumption that software engineering is deterministic and can be accurately planned in advance. Recent agile methods rebel against planning and adopt reactive adaptation to change. However, there is a better way: the introduction of a theory of variation into software engineering. By using lessons from Shewhart, Deming, and Wheeler, it is possible to create predictive methods for software development, architecture, and project management that embrace uncertainty and absorb change gracefully. The result is a system of quality assurance and continuous improvement for software engineering through the reduction of variation.
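The Shewhart-style variation analysis the abstract invokes can be illustrated with an individuals (XmR) control chart, the tool Wheeler popularized for distinguishing routine variation from special causes. This is a minimal sketch, not anything from the talk itself; the cycle-time data below are made-up illustrative numbers.

```python
# Sketch of XmR (individuals and moving range) natural process limits,
# the core computation behind Shewhart/Wheeler control charts.

def xmr_limits(values):
    """Return (lower, center, upper) natural process limits."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard XmR constant (3 / d2, with d2 = 1.128 for n = 2)
    return mean - 2.66 * avg_mr, mean, mean + 2.66 * avg_mr

# Hypothetical task cycle times (days) for completed work items
cycle_times = [4, 6, 5, 7, 5, 9, 6, 5, 8, 6]
low, center, high = xmr_limits(cycle_times)
print(f"lower={low:.1f} center={center:.1f} upper={high:.1f}")
# Points outside [lower, upper] signal special-cause variation worth
# investigating; points inside reflect the routine variation of the system.
```

A project run this way makes predictions from the limits rather than from point estimates, which is one way planning can "embrace uncertainty".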
Parallel Job Deployment and Monitoring in a Hierarchy of Mobile Agents
Dr. Munehiro Fukuda, Assistant Professor, CSS
Grid computing can be considered a natural application of the code/data mobility and navigational autonomy provided by mobile agents. To prove their applicability, the Distributed Systems Laboratory at UW Bothell has been implementing AgentTeamwork, a grid-computing middleware system that deploys and monitors a parallelizable job over remote computers using a hierarchy of mobile agents. Its primary focus is maintaining high availability and dynamic load balancing of the distributed computing resources allocated to a user job. For this purpose, a mobile agent is assigned to each process engaged in the same job; it monitors that process's execution at a different machine, takes periodic execution snapshots, moves the process to a more lightly loaded machine, and resumes it from the latest snapshot upon an accidental crash. Through collaboration among mobile agents, the system also restores broken inter-process communication within the same job, using its error-recoverable socket and Java MPI libraries.
Since the project has been carried out with CSS undergraduate research assistants, this talk presents AgentTeamwork's overview, implementation, and performance, highlighting the students' contribution to each implementation technique.
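The monitor/snapshot/migrate/resume cycle described in the abstract can be sketched in a few lines. This is a language-neutral illustration only; all names here (Machine, ProcessAgent, the node names) are hypothetical and do not reflect AgentTeamwork's actual Java API.

```python
# Illustrative sketch of one agent's checkpoint-and-migrate cycle:
# each agent watches one process, checkpoints it periodically, moves it
# toward lighter-loaded machines, and resumes it from the last snapshot
# after a crash. Names and structure are hypothetical.

class Machine:
    def __init__(self, name, load):
        self.name = name
        self.load = load    # current load, e.g. a 0.0 - 1.0 utilization

class ProcessAgent:
    """One agent is assigned to each process of a parallel job."""
    def __init__(self, host):
        self.host = host
        self.snapshot = None    # latest execution snapshot
        self.step = 0           # simulated execution progress

    def run_and_checkpoint(self):
        self.step += 1              # do some work...
        self.snapshot = self.step   # ...then take a periodic snapshot

    def balance(self, machines):
        # Migrate the process to the most lightly loaded machine.
        lightest = min(machines, key=lambda m: m.load)
        if lightest.load < self.host.load:
            self.host = lightest

    def recover(self):
        # On a crash, resume from the latest snapshot, not from scratch.
        self.step = self.snapshot

machines = [Machine("node0", 0.9), Machine("node1", 0.2)]
agent = ProcessAgent(machines[0])
agent.run_and_checkpoint()
agent.balance(machines)   # process migrates to node1, the lighter host
agent.recover()           # a simulated crash resumes from the checkpoint
print(agent.host.name, agent.step)
```

In the real system each agent is itself mobile and cooperates with the others to repair the job's inter-process communication after such a migration; this sketch shows only the per-process availability logic.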