
Threads
On the other hand, if multiple threads are carrying out activities that may block—and this
includes making RPCs to remote hosts—then multithreading will probably be beneficial.
For example, multiple concurrent RPCs to several hosts may allow a local client to
achieve true parallelism. Note, however, that concurrent RPCs to a single server instance
may not be any more efficient if the server itself cannot get any real benefit from
multithreading of the manager code.
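For instance, a client might issue one RPC per host from a separate thread, so that the calls overlap instead of running back to back. The following sketch illustrates the pattern; the IDL-generated header query.h, the remote operation query_status(), and the host names are hypothetical, the thread calls are shown in their POSIX form (the DCE threads draft 4 signatures differ slightly), and status checking is omitted for brevity.

    #include <pthread.h>
    #include <stdio.h>
    #include <dce/rpc.h>
    #include "query.h"          /* hypothetical IDL-generated client header */

    #define NHOSTS 3

    static char *hosts[NHOSTS] = { "hosta", "hostb", "hostc" };

    /* Each thread binds to one host and makes a blocking RPC; only this
       thread waits for the reply, so the calls to the hosts overlap.    */
    static void *call_one_host(void *arg)
    {
        char                 *host = arg;
        unsigned_char_t      *string_binding;
        rpc_binding_handle_t  binding;
        unsigned32            status;
        long                  result;

        rpc_string_binding_compose(NULL, (unsigned_char_t *)"ncacn_ip_tcp",
                                   (unsigned_char_t *)host, NULL, NULL,
                                   &string_binding, &status);
        rpc_binding_from_string_binding(string_binding, &binding, &status);
        rpc_string_free(&string_binding, &status);

        result = query_status(binding);     /* hypothetical remote operation */
        printf("%s: %ld\n", host, result);

        rpc_binding_free(&binding, &status);
        return NULL;
    }

    int main(void)
    {
        pthread_t threads[NHOSTS];
        int       i;

        for (i = 0; i < NHOSTS; i++)
            pthread_create(&threads[i], NULL, call_one_host, hosts[i]);
        for (i = 0; i < NHOSTS; i++)        /* wait for all replies */
            pthread_join(threads[i], NULL);
        return 0;
    }

Each call blocks only its own thread, so a slow or unreachable host delays only the result from that host.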
RPC servers are multithreaded by default, since multithreading is an obvious way for
servers to simultaneously handle multiple calls. Even if the manager code and
underlying implementation do not permit true parallelism, manager multithreading may
at least allow a fairer distribution of processing time among competing clients. For
example, a client whose call can complete in a short time may not have to wait for another
client whose call is consuming a lot of processor time. For this to occur, threads
must make use of one of the time-sliced scheduling policies (including the default
policy). On the other hand, if all calls make use of approximately similar resources, then
multithreading may become simply an additional, possibly expensive, form of queueing
unless the application or the environment permits real parallelism.
In summary, the developer must consider the following questions in order to decide
whether an application will benefit from multithreading:
•  Are the threaded operations likely to block, for example, because they make blocking
   I/O calls or RPCs? If so, then multithreading is likely to be beneficial in any
   implementation or hardware environment.

•  Can the underlying hardware and RPC implementation support threads on more than
   one processor within a single process? If not, then multithreading cannot achieve real
   parallelism for processor-intensive operations. The DCE user-space threads
   implementation restricts all threads of a single process to contend for a single
   processor and so cannot provide real parallelism for processor-intensive operations.

•  Even if the answer to both of the first two questions is yes, will the use of a time-
   slicing thread scheduling policy permit fairer distribution of server resources among
   contending clients? If so, then server manager multithreading may be beneficial.
Even if, according to these criteria, multithreading is likely to benefit an application, the
programmer still needs to consider the cost, in terms of additional complexity, of writing
multithreaded code. In general, most server manager code will probably benefit from
multithreading, which is provided by default by DCE. Most server applications will
therefore choose to be multithreaded and incur the extra costs of creating thread-safe
code. Whether client code will find the extra complexity of multithreading worthwhile
really depends on a careful assessment of the listed criteria for each program design.
There is no way to predict what a "typical" client will do.
2.1.2 Specifying the Number of Threads
The RPC runtime allows server applications to specify the number of manager threads
available to handle concurrent RPCs via the max_calls_exec parameter of the
rpc_server_listen() routine. The runtime also allows applications to specify the number
of unhandled calls that can be queued via the max_call_requests parameter of the
rpc_server_use_*protseq*() routines.
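For instance, a server might pass explicit values for both limits during initialization, as in
the following sketch. The interface name my_if, its generated header and ifspec, and the
particular limits are illustrative only, and error handling is abbreviated.

    #include <stdlib.h>
    #include <dce/rpc.h>
    #include "my_if.h"            /* hypothetical IDL-generated server header */

    #define MAX_CALL_REQUESTS 20  /* calls that may be queued awaiting a thread */
    #define MAX_CALLS_EXEC    10  /* manager threads executing calls concurrently */

    int main(void)
    {
        unsigned32 status;

        /* Register the interface, using the default manager EPV from the stub. */
        rpc_server_register_if(my_if_v1_0_s_ifspec, NULL, NULL, &status);
        if (status != rpc_s_ok) exit(1);

        /* max_call_requests bounds the queue of calls that have arrived but
           have not yet been assigned to a manager thread.                  */
        rpc_server_use_all_protseqs(MAX_CALL_REQUESTS, &status);
        if (status != rpc_s_ok) exit(1);

        /* ...export bindings to the endpoint map and name service here... */

        /* max_calls_exec bounds the number of manager threads that execute
           incoming calls concurrently.                                     */
        rpc_server_listen(MAX_CALLS_EXEC, &status);
        if (status != rpc_s_ok) exit(1);

        return 0;
    }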
In theory, these two values should be set in