
Import Large Data Programmatically

Import large data from a relational database into the MATLAB® workspace at the command line

You can use a DatabaseDatastore object to import large data into the MATLAB workspace. Create this object with the databaseDatastore function. After importing your data, you can analyze it using tall arrays. For greater flexibility, you can instead analyze the data with the MapReduce programming technique.
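The workflow above can be sketched as follows. This is a minimal sketch, not a complete example: the data source name "MyDataSource", the credentials, the table "inventoryTable", and the column "Price" are all placeholders for your own database objects.

```matlab
% Connect to a database via a configured data source (placeholder names).
conn = database('MyDataSource','username','password');

% Create a DatabaseDatastore backed by an SQL query.
dbds = databaseDatastore(conn,'SELECT * FROM inventoryTable');

% Convert the datastore to a tall table and compute on it lazily.
t = tall(dbds);
minPrice = gather(min(t.Price));  % 'Price' is a hypothetical column name

close(conn)
```

The gather call triggers the deferred evaluation, so the data is pulled from the database in chunks rather than all at once.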

To split an SQL query into multiple page queries and import large data in chunks, use the splitsqlquery function. You can then access the data by using Database Toolbox™ functions or a parallel pool (requires Parallel Computing Toolbox™).
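A paging sketch is shown below, assuming an open connection conn and that splitsqlquery accepts the connection and query and returns an array of page queries; the table name "productTable" is a placeholder.

```matlab
% Split one large query into multiple page queries (sketch; check the
% splitsqlquery reference page for the exact signature in your release).
sqlquery = 'SELECT * FROM productTable';
pagequeries = splitsqlquery(conn,sqlquery);

% Import the data one page at a time and accumulate the results.
results = table();
for k = 1:numel(pagequeries)
    chunk = fetch(conn,pagequeries(k));
    results = [results; chunk]; %#ok<AGROW>
end
```

Because each page query returns only a slice of the result set, the pages can also be fetched on separate workers in a parallel pool initialized with createConnectionForPool.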

Functions


databaseDatastore - Datastore for data in a database
hasdata - Determine if data in DatabaseDatastore is available to read
preview - Return subset of data from DatabaseDatastore
read - Read data in DatabaseDatastore
readall - Read all data in DatabaseDatastore
reset - Reset DatabaseDatastore to initial state
createConnectionForPool - Initialize parallel pool using database connection
splitsqlquery - Split SQL query using paging
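The read functions above are typically combined in a chunked read loop. This sketch assumes an open connection conn; the query and table name are placeholders.

```matlab
% Create a datastore over a query result (placeholder table name).
dbds = databaseDatastore(conn,'SELECT ProductNumber FROM productTable');

% Read the result set one chunk at a time.
while hasdata(dbds)
    chunk = read(dbds);  % returns the next block of rows as a table
    % ... process chunk here ...
end

% Rewind the datastore to read from the beginning again.
reset(dbds)
```

Use preview to inspect the first few rows cheaply, and readall only when the full result set fits in memory.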

Topics

Working with Large Data Sets

Import and export large data sets that are stored in a database.

Import Large Data Using DatabaseDatastore Object

Create a DatabaseDatastore object for accessing collections of data stored in a relational database.

Analyze Large Data in Database Using Tall Arrays

Find the minimum value in a large data set by using a tall table.

Analyze Large Data in Database Using MapReduce

Write mapper and reducer functions for analyzing large data using a DatabaseDatastore object.