Overview of Large Point Cloud Processing
Large point clouds are data sets that contain millions to billions of 3-D points. These
data sets can require more memory than is available in most systems, making them difficult
to load and process using conventional methods. To address this challenge, Lidar Toolbox™ provides the blockedPointCloud object. The object divides a
large point cloud into smaller, spatially organized blocks, as shown in this image. The cube
with the thicker outline represents a single spatial block of the entire point cloud data
set.
The blockedPointCloud object enables you to:
- Divide a point cloud into manageable blocks.
- Load and process one block at a time.
- Perform block-wise operations, such as feature extraction or filtering.
This block-based approach is suitable for processing large point clouds. For smaller point
clouds that fit entirely in memory, use the pointCloud object, which loads
the entire data set at once. However, you can still use block-wise processing for smaller
point clouds if you want to analyze different regions separately.
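As a minimal sketch of that in-memory case: the InMemory adapter described later in this topic lets the blockedPointCloud object read from a pointCloud object directly. The synthetic points and the 10-by-10 block size here are illustrative assumptions.

```matlab
% Sketch: block-wise processing of a point cloud that fits in memory.
% Assumes blockedPointCloud accepts a pointCloud object (InMemory adapter);
% the synthetic data and 10-by-10 block size are illustrative.
xyz = rand(1000,3)*30;                   % random points in a 30-unit cube
ptCloud = pointCloud(xyz);               % small cloud, fully in memory
bpcSmall = blockedPointCloud(ptCloud,[10 10]);  % analyze regions block by block
```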
Block Processing Pipeline
Block processing of large point clouds generally follows the steps shown in this diagram.
The processing pipeline starts with the input adapter, which provides an interface for reading large point cloud data using the blockedPointCloud object. The object selects the required adapter based on the file format of the input point cloud.

The blockedPointCloud object is central to the pipeline. It divides a large point cloud into smaller blocks and provides an interface for accessing each block individually. The object estimates the number of blocks based on the XYZ-limits of the input point cloud and the specified block size.

After partitioning the data, you can process each block independently. You can apply a wide range of processing tasks, such as filtering, segmentation, or feature extraction.

Once you finish block processing, you can write your output data using the output adapter with the blockedPointCloud object.
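The pipeline steps above can be sketched end to end with functions that appear later in this topic. The sample file ships with Lidar Toolbox; the use of tempdir as the output folder is an illustrative assumption.

```matlab
% Sketch of the full pipeline: read -> block-wise process -> write.
% aerialLidarData.laz ships with Lidar Toolbox; tempdir output is illustrative.
pcfile = fullfile(toolboxdir("lidar"),"lidardata","las","aerialLidarData.laz");
bpc = blockedPointCloud(pcfile,[50 50]);             % input adapter chosen from file format
fun = @(block) pcdownsample(block.Data,random=0.1);  % per-block processing task
out = apply(bpc,fun,OutputLocation=tempdir);         % output adapter writes the results
```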
Use of Adapters for Block Processing
Adapters provide interfaces for reading or writing point cloud data from various
sources using the blockedPointCloud object. The object supports these
standard adapters:
- LAS — Enables the blockedPointCloud object to read point cloud data from, or write processed point cloud data to, a LAS or LAZ file.
- InMemory — Enables the blockedPointCloud object to read point cloud data from, or write processed point cloud data to, a pointCloud object. This adapter is useful when you want to perform block processing on a point cloud that fits in memory.
- LASBlocks — Enables the blockedPointCloud object to write the processed data of each block to a separate LAS or LAZ file in a specified folder. This adapter also enables you to read point cloud data from those files.
- MATBlocks — Enables the blockedPointCloud object to write the non-point cloud output generated by processing each block, such as the mean intensity of each block, to a separate MAT file in a specified folder. The adapter also enables you to read such data from those files.
You can also create your own custom adapter to read or write data from other sources.
For more information, see lidar.blocked.Adapter.
Note
The LASBlocks and MATBlocks adapters follow
a write-read process. You must first write the output files using these adapters
before you can read them back. These adapters are not intended for reading arbitrary
files that were not created using this process.
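The write-read process the note describes can be sketched as follows. This assumes that apply returns a blockedPointCloud backed by the per-block files it writes, so the read step is only valid after the write step completes; the output folder is illustrative.

```matlab
% Sketch of the write-read process: files must be written before they are read.
% Assumes apply returns a blockedPointCloud backed by the written per-block files;
% the output folder under tempdir is illustrative.
pcfile = fullfile(toolboxdir("lidar"),"lidardata","las","aerialLidarData.laz");
bpc = blockedPointCloud(pcfile,[50 50]);
outBpc = apply(bpc,@(b) pcdownsample(b.Data,random=0.1), ...
    OutputLocation=fullfile(tempdir,"blocks"));   % write step creates the files
firstBlock = getBlock(outBpc,[1 1 1]);            % read step, valid only after the write
```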
Explore blockedPointCloud Object
To process large point clouds, create a blockedPointCloud object. For
example, this code creates a blockedPointCloud object from a LAZ file,
partitioning the input point cloud into blocks that measure 50-by-50 units along the
x- and y-directions. The z-axis is not
partitioned.
pcfile = fullfile(toolboxdir("lidar"),"lidardata", ...
    "las","aerialLidarData.laz");
bpc = blockedPointCloud(pcfile,[50 50])
bpc =
  blockedPointCloud with properties:

   Read-only properties:
            Source: "C:\Program Files\MATLAB\R2026a\toolbox\lidar\lidardata\las\aerialLidarData.laz"
           Adapter: [1×1 lidar.blocked.LAS]
      SizeInBlocks: [9 6 1]
           XLimits: [4.2975e+05 4.3015e+05]
           YLimits: [3.6798e+06 3.6801e+06]
           ZLimits: [72.7900 125.8200]
   ClassUnderlying: "pointCloud"

   Settable properties:
         BlockSize: [50 50 53.0300]
Notice that the Adapter property specifies the
LAS adapter because the input file format is LAZ. The
SizeInBlocks property value indicates that the object has
divided the input data into nine blocks along the x-axis and six blocks
along the y-axis. For more details on these and other properties, see
blockedPointCloud.
The object also enables you to find the block subscripts for a specified region of interest (ROI), or to find the ROI for specified block subscripts. For example, this code determines the block subscripts for an ROI specified as a six-element row vector of the form [xmin xmax ymin ymax zmin zmax].
idx = roi2blocksub(bpc,[429800 429825 36780020 36780050 85 100])
idx =

     2     6     1

To read the point cloud data for a block, use the getBlock object
function. For example, use this code to read the point cloud data for the block
corresponding to the previously specified ROI:
ptCloud = getBlock(bpc,[2 6 1]);
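getBlock also combines naturally with the SizeInBlocks property to visit every block in turn. This is a hedged sketch; the fprintf summary is illustrative.

```matlab
% Sketch: loop over every block using block subscripts.
% bpc is the blockedPointCloud created earlier; block counts come from SizeInBlocks.
nBlocks = bpc.SizeInBlocks;                 % for example, [9 6 1]
for ix = 1:nBlocks(1)
    for iy = 1:nBlocks(2)
        blockPC = getBlock(bpc,[ix iy 1]);  % pointCloud for one block
        % Process blockPC here; for example, report its point count:
        fprintf("Block (%d,%d): %d points\n",ix,iy,blockPC.Count);
    end
end
```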
Block Processing
Using the blockedPointCloud object, you can apply a specified
processing function to each block. First, define a function handle, and then use the
apply object function as shown in this code. The
OutputLocation name-value argument specifies the folder in which to
store the output data.
fun = @(block)pcdownsample(block.Data,random=0.1);
processedBlocks = apply(bpc,fun,OutputLocation="D:\results");

Deep Learning Using Blocked Point Clouds
Lidar Toolbox also provides the blockedPointCloudDatastore object,
which manages a collection of point cloud blocks from one or more
blockedPointCloud objects. You can specify this datastore as input
to a deep learning network. For more information on using the
blockedPointCloudDatastore object to train a deep learning network,
see the Aerial Lidar Semantic Segmentation Using PointNet++ Deep Learning example.
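As a hedged sketch of that workflow, assuming blockedPointCloudDatastore supports the standard MATLAB datastore interface (hasdata and read); the block size is illustrative:

```matlab
% Sketch: stream blocks through a datastore, as for network training or inference.
% Assumes the standard datastore interface (hasdata/read); block size is illustrative.
pcfile = fullfile(toolboxdir("lidar"),"lidardata","las","aerialLidarData.laz");
bpc = blockedPointCloud(pcfile,[50 50]);
bpcds = blockedPointCloudDatastore(bpc);
while hasdata(bpcds)
    data = read(bpcds);   % returns the next block of point cloud data
end
```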
See Also
blockedPointCloud | blockedPointCloudDatastore | pointCloud | pcshow
Topics
- Deep Learning with Point Clouds
- Datastores for Deep Learning (Deep Learning Toolbox)