As presented by Shyam Pather at TechEd 2010
Data Development GPS: Guidance for Choosing the Right Data Access Technology for Your Application Today
General Overarching Guidance
First, there is no one “right way” to do data access: different technologies have evolved over time for different requirements. Choosing the right access technology is a matter of understanding how your needs map to those requirements.
Second, data access will continue to evolve, so we highly recommend that you structure your applications to accommodate that evolution. You’ll then be able to take advantage of new technologies as they come along—when it’s appropriate to make a change—with minimal impact on the rest of the application.
What’s most important, then, is not the specific data access technology you use but the patterns you implement in the application to position yourself for continued adoption of the most advanced technologies as they become available.
With that in mind, if what you’re using now works, don’t rewrite just for the sake of rewriting. Microsoft is committed both to supporting existing technologies and providing an incremental path to new technologies when you find good reasons to migrate.
Native Data Access Guidance
- For new native applications wanting a generalized abstraction layer with support for multiple data sources, use ODBC. This is the most efficient, full-featured API in which Microsoft will continue to invest.
- If you’re invested in VBA or classic ASP, continue to use ADO. ADO will continue to be supported (including security fixes) but won’t see new feature work.
- If you really want COM-based data access, use OLE DB. This will also continue to be supported but won’t have new feature work.
.NET Data Access Guidance
- New applications considering one of Microsoft’s Object/Relational Mapping technologies should start by looking at the ADO.NET Entity Framework in .NET 4 (including LINQ to Entities) for data access.
- The Entity Framework was introduced with .NET 3.5 SP1; it received many improvements in .NET 4, and most new investment in object/relational mapping will go into the Entity Framework going forward.
- Scenarios for which ADO.NET DataSets have been used are also generally handled well by the Entity Framework.
- The Entity Framework can be incrementally adopted in applications using ADO.NET Core. For example, much of the connect/query/process code of ADO.NET Core can be easily replaced with very simple, entry-level usage of Entity Framework. (An example is given in the Data Development GPS video from time 30:05 to 35:12.)
- For an overview of features, see Evolving the Entity Framework in .NET 4 and Beyond from PDC 2009, and Overview of the Microsoft ADO.NET Entity Framework talk from TechEd 2010.
- For more specific details, see Deep Dive into the ADO.NET Entity Framework from TechEd 2010.
- Use ADO.NET Core when you want the lowest level of control.
- ADO.NET Core remains the basis for data access in .NET.
- It provides the most common and familiar development patterns for data access (connect, query, process).
- DataSets and LINQ to DataSet will continue to be supported.
- For simple applications where you don’t need more than simple connections and streaming results, ADO.NET may be a good choice to get a job done quickly.
- If you have requirements for fine-grained control, ADO.NET might give you more capabilities than the Entity Framework.
- LINQ to SQL is and will continue to be supported but will see little new investment.
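The connect/query/process pattern the guidance refers to is language-agnostic. The talk’s examples are in C#; as a stand-in, here is a minimal sketch in Python using the standard library’s sqlite3 module in place of an ADO.NET provider (the table name and data are hypothetical):

```python
import sqlite3

# Connect: open a connection to the data source (an in-memory
# SQLite database stands in for a real database server here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, price REAL)")
conn.executemany(
    "INSERT INTO products (name, price) VALUES (?, ?)",
    [("Widget", 9.99), ("Gadget", 24.50)],
)

# Query: issue a parameterized command, as with an ADO.NET DbCommand.
cursor = conn.execute("SELECT name FROM products WHERE price > ?", (10.0,))

# Process: stream over the results, as with a DataReader.
expensive = [row[0] for row in cursor]
conn.close()
print(expensive)
```

The same three steps map directly onto connection, command, and reader objects in ADO.NET Core.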
Sidebar: Where should data access code go?
We often see data access code get spread throughout an application as the application evolves, which makes it increasingly difficult to maintain. The best guidance, then, is to adopt structural patterns that separate concerns:
- The Model-View-Controller (MVC), Model-View-Presenter (MVP), and Model-View-ViewModel (MVVM) patterns are all good approaches.
- In the Model layer, hide data access behind a repository and use POCOs (plain old CLR objects) to represent your data.
- The repository pattern hides data access code behind an interface; this is what the Entity Framework builds for you automatically.
- Using POCOs to represent data means each object holds data but isn’t tied to the data access layer (which is abstracted behind the repository interface).
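The sidebar’s advice can be sketched as follows (in Python rather than C#, and with hypothetical type names throughout): a plain object holds the data, a repository interface hides how it is loaded or saved, and callers depend only on the interface, so an in-memory implementation and one backed by real data access code are interchangeable.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List


@dataclass
class Customer:
    # POCO analog: plain data, no reference to any data access layer.
    id: int
    name: str


class CustomerRepository(ABC):
    # Repository interface: the only thing the rest of the app sees.
    @abstractmethod
    def get_all(self) -> List[Customer]: ...

    @abstractmethod
    def add(self, customer: Customer) -> None: ...


class InMemoryCustomerRepository(CustomerRepository):
    # One implementation; a database-backed repository would satisfy
    # the same interface without any caller changing.
    def __init__(self) -> None:
        self._items: List[Customer] = []

    def get_all(self) -> List[Customer]:
        return list(self._items)

    def add(self, customer: Customer) -> None:
        self._items.append(customer)


repo: CustomerRepository = InMemoryCustomerRepository()
repo.add(Customer(1, "Ada"))
print([c.name for c in repo.get_all()])
```

Because the rest of the application holds only a `CustomerRepository`, swapping the storage technology later (the evolution the overarching guidance anticipates) touches one class, not the whole codebase.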
WCF Data Services and OData Guidance
WCF Data Services was created because a large number of web services do little more than expose a data model in an ad hoc manner. The technology is thus composed of the following:
- A server framework for creating REST-based, data-centric web services on top of a data model, with appropriate security.
- A REST-based protocol called OData (the Open Data Protocol). See www.odata.org.
- Use WCF Data Services/OData for services that primarily expose data with few (if any) service operations; that is, when you’re primarily exposing a data model.
- The data model can come from several sources: an Entity Data Model from the Entity Framework (easiest), a Reflection Provider over CLR objects, the Data Service Provider interfaces, or a custom implementation of the OData specification.
- Use straight WCF for services that primarily provide service operations with data being only a small consideration.
- Two primary WCF Data Services scenarios:
- Client-server designed and deployed together using a web service that exposes data; functionality surfaces in the user interface.
- Online services that expose data on the web with loosely-coupled clients and servers; state and functionality surface through the service interface.
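To make the protocol side concrete: an OData service exposes entity sets as URIs, with system query options such as $filter, $orderby, and $top appended as query-string parameters. A client request can therefore be composed with nothing more than standard URL encoding; the service root and entity set below are hypothetical.

```python
from urllib.parse import quote

service_root = "https://example.com/odata.svc"  # hypothetical service root
entity_set = "Products"                          # hypothetical entity set

# OData system query options (see www.odata.org); the "$" prefix
# marks them as protocol-defined rather than model-defined.
options = [
    ("$filter", "Price gt 10"),
    ("$orderby", "Name"),
    ("$top", "5"),
]

query = "&".join(f"{name}={quote(value)}" for name, value in options)
url = f"{service_root}/{entity_set}?{query}"
print(url)
```

Any HTTP client can then GET that URI and receive the filtered entity set, which is what makes the loosely-coupled client scenario above practical.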
See the videos at http://msdn.microsoft.com/en-us/data/videos.aspx#dataservices.