Reviews a number of important algorithms, with emphasis on correctness and efficiency. The topics covered include solution of recurrence equations, sorting algorithms, selection, binary search trees and balanced-tree strategies, tree traversal, partitioning, graphs, spanning trees, shortest paths, connectivity, depth-first and breadth-first search, dynamic programming, and divide-and-conquer techniques.
This course covers the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.
This course introduces the concepts and tools necessary for building robust web applications using the Microsoft .NET Framework. Among the topics explored will be the multi-language .NET architecture, build and deployment process, the C# programming language, Entity Framework, Model-View-Controller (MVC) design pattern, web services, Git version control, and the Azure cloud platform. This is a hands-on project-based course where students will build a fully functional web solution.
Objective of the course
Modern societies are increasingly dependent upon the proper functioning of computer systems. Hardware and software safety and security are critical in many applications of computer science (for example in cyber-physical systems such as avionics, medical devices, and automotive/transportation, but also in enterprise and service software such as accounting, banking, trading, social networking, etc.).
Yet, computer programs are riddled with flaws that at best are inconvenient and annoying and at worst have extremely damaging consequences. The Heartbleed buffer over-read security bug in the open-source OpenSSL cryptography library is a recent example among thousands of other less publicized ones, not to mention those yet to be discovered. Software systems of millions of lines of code are now common, which means thousands of undiscovered, potentially disastrous bugs.
This situation is peculiar to computer science: in other engineering disciplines, engineers are legally responsible for their errors, whereas programmers are not, so no computer scientist is compelled to use best practices. A direct consequence is the use of cheap and ineffective practices such as debugging. ``Program testing can be used very effectively to show the presence of bugs but never to show their absence'' (E. W. Dijkstra).
In software and hardware engineering, formal methods aim at using the rigor of mathematics for the specification, development, and verification of reliable and robust software and hardware systems. Formal methods are now part of academic and industry standards, as shown for example by DO-178C, the primary document for certifying all commercial software-based aerospace systems. In particular, the DO-333 supplement to DO-178C discusses the use of sound formal methods as part of a software life cycle. So a firm foundation in formal methods is now an integral part of the core knowledge of every high-level professional computer scientist.
The ultimate objective of formal methods is to verify the correctness of computer systems and to have this verification completely automated, using computers. This objective is hard to achieve because of undecidability: no computer, however ``intelligent'', can correctly answer, within finite time and memory resources, every question about the correctness of an arbitrary computer system. Therefore all automated formal methods have to make compromises to escape this inevitable limitation on the power (including any form of ``intelligence'') of computer systems.
All automated formal methods must provide answers in finite time to undecidable questions, and so have limitations:
A consequence of undecidability is that every automated formal verification method will fail on infinitely many programs. Fortunately, it will also succeed on infinitely many programs. By succeed, we mean that if a question about a program is answered, then that answer can definitely be trusted. Not all formal methods can be trusted: program simulation, symbolic execution, and bounded model checking, for example, may be good at finding bugs, but they are not sound formal verification methods.
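The undecidability limitation above rests on a classic diagonalization argument, which can be sketched in a few lines of Python (a toy illustration, not course material; the names `make_paradox` and `halts` are invented for this sketch). Suppose some function `halts(f, x)` could always correctly decide whether `f(x)` terminates; we can then build a program that contradicts it.

```python
# Toy sketch of the diagonalization argument behind undecidability:
# no total function halts(f, x) can correctly decide, for every
# program f and input x, whether f(x) terminates.

def make_paradox(halts):
    """Given any candidate halting decider, build a program it misjudges."""
    def paradox():
        if halts(paradox, None):   # decider claims paradox halts...
            while True:            # ...so loop forever,
                pass
        return "halted"            # decider claims it loops, so halt at once.
    return paradox

# Try the candidate decider that answers "never halts" for everything:
p = make_paradox(lambda f, x: False)
print(p())  # prints "halted" -- contradicting the decider's own answer
```

Whatever the candidate decider answers about the paradoxical program, the program does the opposite, so no candidate can be correct on all inputs.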
It is therefore necessary to study formal methods not only as recipes for getting correct programs but also from a mathematical point of view, to understand their respective principles. This is the role of abstract interpretation. Abstract interpretation originated from the need to formalize the soundness of, and generalize, the dataflow analysis and typing performed by compilers into fully automated program verification (and is often limited to this understanding). It turned out that the principles identified by abstract interpretation were able to establish the foundations of all formal methods (preferably sound ones, but also to study the origin of unsoundness in incorrect ones). Abstract interpretation has also been applied to the design of highly successful static analysis tools such as Astrée.
The general idea is that, to state anything formal about a computer system, it is necessary to rigorously define its semantics, that is, a model of all its possible behaviors in any possible execution environment. Reasoning about the computer system is then done on an abstraction of the properties of this semantics, which has to be derived, often by calculational design, from the semantics. Most often an inductive reasoning is needed, which is possible thanks to abstract induction. So the study of formal methods amounts to the study of the abstractions they are built upon. Examples of applications of the theory of abstract interpretation include SMT solvers, proof methods, and static analyzers.
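As a toy illustration of this idea (not part of the course material; the names `alpha`, `abs_add`, and `abs_mul` are invented here), consider a sign abstraction: instead of running a program on concrete integers, we run it on the abstract values negative/zero/positive/unknown, soundly over-approximating the sign of every result.

```python
# Toy sign domain: a minimal sketch of abstract interpretation.
NEG, ZERO, POS, TOP = "-", "0", "+", "T"   # TOP = sign unknown

def alpha(n):
    """Abstraction: map a concrete integer to its sign."""
    return NEG if n < 0 else ZERO if n == 0 else POS

def abs_add(a, b):
    """Sound abstract addition on signs."""
    if ZERO in (a, b):
        return b if a == ZERO else a   # adding zero preserves the sign
    if a == b and a in (NEG, POS):
        return a                       # same nonzero signs: sign preserved
    return TOP                         # mixed or unknown signs: unknown

def abs_mul(a, b):
    """Sound abstract multiplication on signs."""
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG      # rule of signs

# Soundness: the abstract result always covers the concrete one.
print(abs_mul(alpha(-3), alpha(-7)))   # '+' : product of two negatives
print(abs_add(alpha(-3), alpha(7)))    # 'T' : sum of mixed signs is unknown
```

Note the deliberate loss of precision in the last line: the analysis answers ``unknown'' rather than risk an unsound answer, which is exactly the compromise that undecidability forces on sound automated methods.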
The course will put an emphasis on the use of abstract interpretation to design proof methods and static analyzers, which are now becoming part of the practice of companies developing very large software systems (such as Facebook, Google, Microsoft, Spotify, and Uber; see for example CodeContracts or Infer).
During the past decades, the field of High Performance Computing (HPC) has been about building supercomputers to solve some of the biggest challenges in science. HPC is where cutting-edge technology (GPUs, low-latency interconnects, etc.) is applied to the solution of scientific and data-driven problems. One of the key ingredients of the current success of machine learning (ML) is the ability to perform computations on very large amounts of training data. Today, the application of HPC techniques to ML algorithms is a fundamental driver for the progress of Artificial Intelligence. In this course, you will learn HPC techniques that are typically applied to supercomputing software, and how they are applied to obtain the maximum performance out of ML algorithms.
In this course you will learn how to develop multimedia applications for both the desktop and the web. The course focuses on general programming techniques with an emphasis on the production of interactive graphical environments. In addition, a number of advanced topics will be covered including image processing, video games, simulations, basic computer vision, augmented reality (AR), virtual reality (VR) and data visualization techniques.
This exciting and fast-evolving field of computer science has many recent consumer applications (e.g., Microsoft Kinect, Google Translate, iPhone's Siri, digital camera face detection, Netflix recommendations, Google News) and applications within the sciences and medicine (e.g., predicting protein-protein interactions, species modeling, detecting tumors, personalized medicine). Students learn the theoretical foundations and how to apply machine learning to solve new problems.
In this course, students will learn to create applications for Apple's iOS on both the iPhone and the iPad using Objective-C, Apple's new programming language Swift, and the iOS SDK. Since its introduction, the Apple iOS SDK has proved to be a powerful platform upon which to build sophisticated applications. Without actually having to own an iPhone or an iPad, students will be able to build and test their applications on Apple Macs using the freely available Xcode development environment and its iOS Simulator. Students will become proficient in the object-oriented language Objective-C, Swift, the Apple iOS frameworks, and the Xcode development environment. This is a new, dynamic, constantly evolving topic, and students will be at the forefront of new technological advances.
Most of us have learned to program a single microprocessor (a single core) using a high-level programming language such as C/C++ or Java. This is called sequential programming. We feel very comfortable with it because we think in a sequential way and give the machine statements to be executed in sequence. However, this has to change. Single-core microprocessors have all but disappeared from the computers we use today (including tablets and smartphones); most of our devices now contain multicore processors, which pack several processing cores onto a single chip. To make the best use of these multicore chips we need to program them in parallel. Sequential programming, on all platforms from smartphones to supercomputers, is falling out of fashion and taking a back seat to parallel programming. How do we think in parallel? How do we write parallel code that makes the best use of the underlying hardware? How does this new hardware differ from the traditional kind? What will the future hold for software and hardware? These are the topics of this course.
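A minimal sketch of the sequential-to-parallel shift described above (the names `chunk_sum` and `parallel_sum` are illustrative, not course material): a large sum is split into chunks whose partial sums are computed on separate cores using Python's standard-library `ProcessPoolExecutor`, then combined.

```python
# Split a big reduction across cores: each worker process sums one
# chunk, and the partial sums are combined at the end.
from concurrent.futures import ProcessPoolExecutor

def chunk_sum(chunk):
    """Work done independently by each worker: sum one chunk."""
    return sum(chunk)

def parallel_sum(data, workers=4):
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))  # combine partial sums

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(parallel_sum(data) == sum(data))  # same answer, work spread over cores
```

The point is the change in thinking: the programmer must decide how to decompose the work, distribute it, and combine the results, questions that simply do not arise in sequential code.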
A practical introduction to creating modern web applications. Covers full-stack web development, including topics such as database and data-model design, MVC architecture, templating, handling user input, asynchronous processing, and client-side interactivity. Students will use current server- and client-side web frameworks to build dynamic, data-driven sites. Various tools to support development will also be introduced, such as version control and build systems. Basic knowledge of HTML and CSS and familiarity with command-line tools are recommended.