1. Streaming the data to a collector, which may perform de-duplication depending on the volume and needs of the original data sets.
2. Fetching the data periodically from a remote source.
3. Having the collector push the data directly into a common data store used by the processing pipeline of the ...
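As a rough sketch of how options 1 and 3 can compose, the snippet below shows a hypothetical collector that de-duplicates a stream of records (by content hash) before pushing them into a shared store. All names here (`Collector`, `deduplicate`, the list standing in for the data store) are illustrative assumptions, not part of any particular system:

```python
import hashlib
from typing import Iterable, Iterator


def deduplicate(records: Iterable[bytes]) -> Iterator[bytes]:
    """Yield each record once, dropping duplicates seen earlier in the stream."""
    seen: set[str] = set()
    for record in records:
        digest = hashlib.sha256(record).hexdigest()
        if digest not in seen:
            seen.add(digest)
            yield record


class Collector:
    """Hypothetical collector: receives streamed records, optionally
    de-duplicates them, and pushes the result into a common store."""

    def __init__(self, store: list[bytes]) -> None:
        # A plain list stands in for the shared data store.
        self.store = store

    def ingest(self, stream: Iterable[bytes]) -> int:
        """Push de-duplicated records into the store; return the count pushed."""
        pushed = 0
        for record in deduplicate(stream):
            self.store.append(record)
            pushed += 1
        return pushed


store: list[bytes] = []
collector = Collector(store)
count = collector.ingest([b"a", b"b", b"a", b"c"])
# The duplicate b"a" is dropped, so three records reach the store.
```

Whether de-duplication belongs in the collector at all is the trade-off the first option hints at: for low-volume sources it is cheap insurance, while for high-volume streams a bounded structure (e.g. an LRU set or a Bloom filter) would replace the unbounded `seen` set used here.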