Parallelism
Revision as of 02:20, 29 March 2018
External
Internal
Overview
Parallelism can be exploited when the code that is executed in parallel is safe, meaning it does not access shared mutable data. Such code is referred to as a pure function.
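A minimal sketch of the distinction, using hypothetical names (`square`, `impureIncrement` are illustrative, not from the article): the pure function depends only on its argument, while the impure one reads and writes shared state and is therefore unsafe to run in parallel.

```java
import java.util.function.IntUnaryOperator;

public class PureFunctionExample {
    // Pure: the result depends only on the argument; no shared state is touched.
    static final IntUnaryOperator square = x -> x * x;

    // Impure: reads and writes shared mutable state, so concurrent calls race.
    static int counter = 0;
    static int impureIncrement(int x) {
        counter += x; // side effect on shared data
        return counter;
    }

    public static void main(String[] args) {
        System.out.println(square.applyAsInt(7)); // always 49, regardless of context
    }
}
```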
Streams API and Parallelism
The Streams API exposes map and reduce operations, which can then be parallelized internally over the elements of the stream, provided that the functions used in mapping and reduction are associative, stateless and non-interfering. Map-reduce is an alternative to iterative computation, which involves shared state and does not parallelize gracefully.
Parallelizing computation requires partitioning the input, applying transformations, then reducing the results.
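The partition-transform-reduce pipeline above can be sketched with a parallel stream; the choice of summing squares is illustrative. The runtime partitions the range across worker threads, `map` applies a stateless transformation, and `reduce` combines the partial results with an associative operation.

```java
import java.util.stream.IntStream;

public class ParallelSum {
    public static void main(String[] args) {
        int sumOfSquares = IntStream.rangeClosed(1, 100)
                .parallel()               // partition the input across threads
                .map(x -> x * x)          // stateless, non-interfering transformation
                .reduce(0, Integer::sum); // associative reduction of partial results
        System.out.println(sumOfSquares); // 338350
    }
}
```

Because addition is associative, the reduction yields the same result regardless of how the runtime partitions the range.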