
Notes on using this document: either use the Navigation pane on the left, or hold the Control key and click the required section in the table of contents below (press Control and Home to return to the top).

See also the Datastream functions help


http://product.datastream.com/navigator/AdvanceHelpFiles/Functions/WebHelp/HFUNC.htm

A-Z of Datastream expressions for popular indicators, adjustments and calculations
    Advance-Decline ratio and line
    Annualized growth rate / compound annual growth rate (CAGR)
    Arms Ease of Movement Index
    Beta
    Bollinger bands
    Buy-and-hold abnormal returns (BHAR)
    Chaikin money flow
    Coppock indicator
    Correlation
    Correlation coefficient significance
    Decumulated values
    Earnings yield
    Enterprise value
    Exponential weighted average
    Graham & Dodd PE
    Herfindahl index
    Implied volatility
    K and D Stochastics
    MACD
    Maximum Drawdown and the Underwater measure
    On balance volume
    Oscillator
    PEG ratio
    Performance relative to benchmark
    R square
    Rebasing
    Regression standard error and t-value
    Relative strength index
    Revisions ratio I/B/E/S
    Risk premium
    Sharpe ratio
    Shiller PE ratio or CAPE
    Tobin’s Q
    Toraku index
    Volatility
    Volume weighted average price (VWAP)
    Weighted average cost of capital (WACC)
    Williams %R
    Williams variable accumulation
    Z-score (Altman)
    Z-score (Statistical)
Other helpful hints for Datastream expressions and functions
    Percentage of stocks trading above/below a high/low or moving average
    Dummy time series for use in expressions
    Holidays
    IF# function and conditional tests
    Index calculation from percentage change series
    Inflation (removal of)
    Limits for expressions and AFO/DFO requests
    LIST# expressions
    Unpadding values or switching off carry forward
    Risk free interest rates
Units, scale and currency
    Converting to a different currency
    Decimal places
    How to determine the units/scale for a datatype
    Volume definitions
DFO and Datastream Charting workflow tips
    7 day week data retrieval in DFO
    AFO to DFO conversion tips & troubleshooting
    Additional DSGRID parameters
    Date alignment for economics series
    Drawing a custom trend line from an equation
    Half-year frequency
    Line of best fit / trend line
    Not displaying dates where values are #NA in DFO 3.0
    Removing capitalization in AFO/DFO output
    Using SEDOL and CUSIP identifiers
    Using the DS.SYMBOLLOOKUP datatype to find the best matching Datastream series from a series description
Using VBA commands to customize your sheets and inhouse time series uploads - Datastream For Office (DFO) 3.0
    Refresh methods
    Methods to display Navigator with a specified category
    Methods to control the User Created Time Series (UCTS) template sheet

A-Z of Datastream expressions for popular indicators, adjustments and calculations

Advance-Decline ratio and line

The Advance-Decline Ratio divides the number of stocks that closed higher than the previous day (advances) by the number of
stocks that closed lower than the previous day (declines).

If datatypes RS (Number of Rises) and FS (Number of Falls) are available for an index, then the ratio can be calculated as
follows, for example:

TOTMKUK(RS)/TOTMKUK(FS)*1.000

However, if RS and FS are unavailable for a particular index, the best solution would be to calculate a User Created Index (UCI)
from the index constituent list - as RS and FS can be selected as datatypes to be calculated.

The related concept of the Advance-Decline Line is a measure of broad market strength, and the line tends to peak well ahead
of the more widely followed market indices and averages. It is calculated as the cumulative difference between the advancing
and declining stocks each day.

• When the daily difference is positive (the line moves up), more stocks are advancing than declining.
• When the daily difference is negative (the line moves down), more stocks are declining than advancing.

For example:

CSUM#(TOTMKUK(RS))-CSUM#(TOTMKUK(FS))

Again, if RS and FS are not directly available (e.g. for the S&P 500 index) a solution is to calculate a UCI and select these two
datatypes from the choices to be calculated.
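
The underlying arithmetic is easy to reproduce outside Datastream as a cross-check. Below is a minimal Python/pandas sketch (not Datastream syntax) that builds the Advance-Decline line from a table of constituent closing prices; the DataFrame layout is an illustrative assumption.

import pandas as pd

def advance_decline_line(closes: pd.DataFrame) -> pd.Series:
    # closes: one row per date, one column per constituent stock (illustrative layout)
    daily_change = closes.diff()                 # change versus the previous day's close
    advances = (daily_change > 0).sum(axis=1)    # number of rises each day (cf. datatype RS)
    declines = (daily_change < 0).sum(axis=1)    # number of falls each day (cf. datatype FS)
    return (advances - declines).cumsum()        # cumulative difference = Advance-Decline line
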
Annualized growth rate / compound annual growth rate (CAGR)

The GRFL# function is normally used to calculate the annualised growth rate, also known as the Compound Annual Growth
Rate (CAGR) between two values/dates, using a trend line between the first and last value. GRLS# is also available, which
calculates the annualised growth rate using a least squares trend line - but is less commonly used.

For example, GRFL#(FTSE100,1M) calculates the annualised 1 month compound growth rate for the FTSE 100 index, and
GRFL#(UKGDP...D,1Q) calculates the annualised quarterly compound growth rate for UK GDP.
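
For reference, the compound annual growth rate between a first and last value reduces to simple arithmetic. The sketch below is plain Python with illustrative numbers; it shows the compounding idea only, not the GRFL# trend-line fit itself.

def cagr_percent(first_value: float, last_value: float, years: float) -> float:
    # Compound annual growth rate, in percent, between two values 'years' apart
    return ((last_value / first_value) ** (1.0 / years) - 1.0) * 100.0

# A series growing from 100 to 121 over 2 years compounds at 10% per annum
print(cagr_percent(100.0, 121.0, 2.0))   # prints 10.0 (approximately)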

Arms Ease of Movement Index

The Arms Ease of Movement Index emphasizes days in which the stock moves easily and minimizes the days in which the
stock finds it difficult to move. The indicator is used frequently with equivolume charts to identify market formations. A buy signal
is generated when the index crosses above zero; a sell signal is generated when it crosses below zero. When the index hovers
around zero, there are small price movements and/or high volume i.e. the price does not move easily.

057E "ARMS EASE OF MOVEMENT INDEX"

(((X(PH)+X(PL))/2)-((LAG#(X(PH),1D)+LAG#(X(PL),1D))/2))/(X(VO)/(X(PH)-X(PL)))

In general terms, the formula is the change in the daily midpoint price, ((High + Low)/2 - (Previous High + Previous Low)/2), divided by the "box ratio" of Volume / (High - Low), as reflected in the expression above.

Beta

The most common measure of historical beta uses a rolling 5 year time period of monthly frequency logarithmic changes - this is
available as global expression 897E "DS HISTORICAL BETA LOCAL INDEX" and this method uses a benchmark Local Index
(LI) for the respective stock.

REGB#(LN#(X(LI)/LAG#(X(LI),1M)),LN#(X/LAG#(X,1M)),60M)

Alternatively, a similar global expression 851E "DS HISTORICAL BETA (5 YEARS)" is available, where X is an index of your
choice, and Y is the stock. This can be used in a monthly frequency time series request as 851E(FTSE100,VOD) for example.

REGB#(LN#(X/LAG#(X,1M)),LN#(Y/LAG#(Y,1M)),60M)
The expression can be adapted for different time periods and frequencies if preferred - for example, 845E "DS HISTORICAL
BETA (2 YEARS)" uses two years (104 weeks) of weekly frequency observations:

REGB#(LN#(X/LAG#(X,1W)),LN#(Y/LAG#(Y,1W)),104W)

And the following uses 1 year (252 trading days) of daily frequency observations:

REGB#(LN#(X/LAG#(X,1D)),LN#(Y/LAG#(Y,1D)),252D)

Further alternatives and examples are also available in the Expression Picker, folder Datastream > Risk and Volatility > Beta
Calculations.
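
The calculation behind these expressions is an ordinary least squares slope of the stock's log returns against the benchmark's log returns. A minimal numpy sketch of that arithmetic, assuming two aligned price arrays sampled at the chosen frequency:

import numpy as np

def historical_beta(stock_prices: np.ndarray, index_prices: np.ndarray) -> float:
    # Log returns, equivalent to LN#(X/LAG#(X,1M)) at monthly frequency
    stock_returns = np.diff(np.log(stock_prices))
    index_returns = np.diff(np.log(index_prices))
    # OLS regression of stock returns on index returns; the slope is beta
    slope, intercept = np.polyfit(index_returns, stock_returns, 1)
    return float(slope)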

Bollinger bands

Bollinger Bands provide a relative definition of high and low in the sense that prices are considered high at the upper band and
low at the lower band. They consist of 3 lines drawn in relation to an instrument's price - the middle band usually being a simple
moving average while the upper and lower bands are usually 2 standard deviations either side.

Values can be calculated using the following global expressions for example, which use a 20 day moving time period:

113E "UPPER BOLLINGER BAND"

MAV#(X,20D)+(2*SDN#(X,20D))

114E "LOWER BOLLINGER BAND"

MAV#(X,20D)-(2*SDN#(X,20D))
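
A minimal pandas sketch of the same bands, assuming a daily closing price Series named close (SDN# is a population standard deviation, hence ddof=0):

import pandas as pd

def bollinger_bands(close: pd.Series, window: int = 20, width: float = 2.0) -> pd.DataFrame:
    middle = close.rolling(window).mean()        # MAV#(X,20D) equivalent
    sd = close.rolling(window).std(ddof=0)       # SDN#(X,20D) equivalent
    return pd.DataFrame({"middle": middle,
                         "upper": middle + width * sd,
                         "lower": middle - width * sd})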

An example is available in Datastream Charting - see the following section of the Library:

Datastream > Sample Charts > Technical Charts > Example Templates > Bollinger Bands & MACD.

This template is also available in the Chart Studies app in EIKON, and the Studies page from the EQUITY menu in Datastream
Professional. See the Datastream > Equities > Technical folder.

Buy-and-hold abnormal returns (BHAR)

The term BHAR is taken from Event Study literature and is generally defined as the difference between actual and expected
returns, under a buy-and-hold long investment strategy.

The global expression 481E is a (static) long term performance measure, which measures the excess compounded percentage
return for a financial instrument, relative to a reference or benchmark instrument between two given dates. For both instruments,
all dividends are assumed to be reinvested over time.

Take two dates A and Z, with A>Z. The measure is defined as:

481E "BHAR STATIC"

(VAL#(X(RI),A)/VAL#(X(RI),Z)-VAL#(Y(RI),A)/VAL#(Y(RI),Z))*100

The following measure is a continuous version of the BHAR measure; the parameter Z is the date for which the measure is
supposed to be "started" and have the value 0:

482E "BHAR CONTINUOUS"

REB#(X(RI),Z)-REB#(Y(RI),Z)

Examples:
481E(MKS,FTSE100,-5Y,0D) calculates the static BHAR for the Marks & Spencer equity relative to the benchmark index FTSE
100, over the latest 5 year time period.

482E(MKS,FTSE100,31/12/99) calculates continuously the performance in terms of compounded returns for an investment
made at end of December 1999 in the Marks & Spencer equity relative to the FTSE 100.

Chaikin money flow

The 20-day period (for example) Chaikin Money Flow for an equity, which uses the daily closing price along with datatypes PH
(Intraday Price High), PL (Intraday Price Low) and VO (Volume traded) can be written as:

SUM#(((X-X(PL))-(X(PH)-X))/(X(PH)-X(PL))*X(VO),20D)/SUM#(X(VO),20D)

Coppock indicator

The Coppock Indicator was developed by Edwin Coppock, a US investment advisor. It is a momentum oscillator that was
designed for long term investors to begin accumulation at the beginning of a bull market. The Coppock Indicator signals the
beginning of a bull market when it crosses above the zero line. This signal usually comes after the first leg of a bull market is
already underway, and is therefore regarded as highly reliable.

It is available under the following global expression:

052E "COPPOCK INDICATOR"

WAV#(PCH#(FTSE100,14M)+PCH#(FTSE100,11M),10M)

Correlation

Correlation indicates the extent to which two variables move in the same direction, i.e. increase or decrease. The highest value
for correlation is 1 (positive correlation), meaning that they always move in the same direction, either both upwards or both
downwards; the minimum value is minus 1 (inverse correlation), indicating that whenever one variable increases in value, the
other decreases in value.

Correlation can be calculated using the CORR# function - see the "Statistical" folder of the Function definition help pages.

In the following global expression, X is the equity and X(LI) is used to automatically select the appropriate local benchmark
index for the equity; logarithmic monthly changes over a rolling 5 year period (60 months) are used for the calculation.

841E "HISTORICAL CORRELATION 5 YEAR"

CORR#(LN#(X(LI)/LAG#(X(LI),1M)),LN#(X/LAG#(X,1M)),60M)

Related correlation examples are also available in the Expression Picker, folder Datastream > Correlation and Regression.

There is also a specialised correlation tool available in Datastream Charting, for identifying the correlation between lines, and
the lag/lead period that displays the strongest correlation.

Correlation coefficient significance

This tests whether the value of a correlation coefficient r between two random variables (which must be between -1 and 1) is
statistically significantly different from zero. This can be tested using the t-distributed statistic:

t = r * SQRT(n - 2) / SQRT(1 - r^2)

having n-2 degrees of freedom, where n is the number of observations (given some assumptions about the bivariate probability
distribution).

An assumption, which is sufficient for this result, is that the two data series constitute a random sample from a bivariate normal
distribution – more details are available in standard statistical textbooks or online etc.

Generally, t values over 3 are regarded as very significant (to confirm precisely, consult a statistical table showing critical values
against the number of observations, available in standard textbooks or websites).
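
The same test is easy to reproduce in plain Python - a minimal sketch of the statistic above:

import math

def correlation_t_stat(r: float, n: int) -> float:
    # t statistic for testing a sample correlation coefficient r against zero, with n observations
    return r * math.sqrt(n - 2) / math.sqrt(1.0 - r * r)

# e.g. a correlation of 0.4 over 60 monthly observations
print(correlation_t_stat(0.4, 60))   # roughly 3.3, i.e. significant by the rule of thumb above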

Decumulated values

Some economics series are published by sources on a cumulative basis - i.e. for each observation period, the latest value is
added to the sum of the previous values.

These cumulative series are marked with the abbreviation (CMLV) in Navigator - for example the quarterly China economics
series CHPERDISA "PER CAPITA DISPOSABLE INCOME, URBAN HOUSEHOLDS (CMLV)".

If you would like to retrieve decumulated values instead of the cumulative sum, there are two global expressions available,
depending on whether the series is monthly or quarterly.

310E "DECUMULATIVE QUARTER"

X*IF#(DUMMYQT1,GT,ZERO)+NA#(X-LAG#(X,1Q))*IF#(DUMMYQT1,EQ,ZERO)

311E "DECUMULATIVE MONTH"

X*IF#(DUMMYJAN,GT,ZERO)+NA#(X-LAG#(X,1M))*IF#(DUMMYJAN,EQ,ZERO)

So for example, 310E(CHPERDISA) returns the decumulated quarterly values for this series.

Example charts which illustrate the use of these expressions are available in the Datastream Charting Library, folder:

Datastream > Charting Features > Analytics > Functions and Expressions

Earnings yield

Earnings Yield is simply the reciprocal of the P/E ratio (datatype PE, which is available for equities and most equity indices), so
this can be calculated in decimal form as 1/X(PE) or in percentage form as 100/X(PE)

So for example the Earnings Yield for the FTSE All Share can be calculated in % form as 100/FTALLSH(PE)

Enterprise value

Enterprise Value is a popular measure of a company's total value, calculated as the sum of:

Market Value of Equity + Debt - Cash (i.e. Net Debt) + Preferred Shares + Minority Interest

It is available as an annual frequency financial statement item under Worldscope datatype WC18100, or on a last 12 months
basis under Datastream Worldscope datatype DWEV, which also takes interim financial statements into account.

Alternatively, there are a number of Global Expressions available, which use daily frequency Market Value of Equity data in
order to calculate more timely latest values, and higher frequency history.

See the following folder in the Expression Picker, which includes expressions based on standard MV values along with others
which use datatype MVC (Market Value - Consolidated, for dual listed companies), and others which use interim Worldscope
values rather than annual.

Datastream > Fundamentals > Enterprise Value


Also see the following folder, which offers expressions for calculating I/B/E/S estimate valuation ratios with Enterprise Value as
the numerator:

Datastream > Ratios > Estimates Ratios

In addition, EV valuation ratio templates are available in Datastream Charting - see the following section of the Library:

Templates > Datastream > Equities > Valuation

These templates are also available in the Chart Studies app in EIKON. See the Datastream > Equities > Valuation folder.

Exponential weighted average

This type of average calculation achieves a smoothing effect on historical values by applying a weighting factor. It can be
calculated using the EWA# function - see the "Average" folder of the Functions help and definitions pages for further information
and examples.

Whereas the MAV# (Moving Average) function gives equal weight to each data point, the EWA# function gives different relative
weights to latest values vs. earlier values, as determined by the weighting factor.

The weighting factor must be between 0 and 1; the lower the weighting, the greater the smoothing effect.
The factor can be calculated from a given time period using the formula:

Weighting Factor = 2 / (days + 1)

So e.g. for a 26 day exponential weighted average, the weighting factor would be calculated as:

2 / 27 = 0.074

A popular technical analysis application of this indicator (MACD) is to calculate the difference between a 12 day (2/13=0.154) and
26 day (2/27=0.074) EWA:

EWA#(X,0.154)-EWA#(X,0.074). This is the global expression 053E.

The output value from this expression provides a dynamic indication of a change in trend. When the value moves from negative
to positive, i.e. the shorter-term moving average (the first EWA# function, for 12 days) rises above the longer-term moving
average (the second EWA# function, for 26 days), it is seen as a bullish indication. Conversely, a change from positive to
negative indicates a bearish outlook.
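
The same weighting convention appears in pandas, where an exponentially weighted mean with span n uses alpha = 2/(n+1). A minimal sketch of the 12 day minus 26 day difference described above, assuming a daily price Series named close:

import pandas as pd

def ewa_difference(close: pd.Series, short_span: int = 12, long_span: int = 26) -> pd.Series:
    short_ewa = close.ewm(span=short_span, adjust=False).mean()   # weighting factor 2/13 = 0.154
    long_ewa = close.ewm(span=long_span, adjust=False).mean()     # weighting factor 2/27 = 0.074
    return short_ewa - long_ewa                                   # cf. EWA#(X,0.154)-EWA#(X,0.074)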

Graham & Dodd PE

In order to reduce the impact of the cyclical nature of profits, Graham and Dodd argued that PE ratios should compare stock
prices to earnings that have been averaged over not less than five years, and preferably seven to ten years.

For example, to use a ten-year moving average of historical earnings for a stock or an index, the expression for the PE ratio is
the following, which is available as a Global Expression.

038E "GRAHAM & DODD PE RATIO 10Y"

X/MAV#(X/X(PE),10Y)

Adjustment for inflation can also be made, by dividing the price by the relevant consumer price index. This is commonly known
as the Cyclically Adjusted PE (CAPE), or equivalently the Shiller PE (after the economist Robert Shiller). For the US the Shiller
PE is USSPRPER.

Herfindahl index

Also known as Herfindahl–Hirschman Index (HHI), this indicator is a measure of the competition between firms within an
industry, and is calculated as the cross-sectional sum of the respective squared market share of each firm within that industry.
Net Sales or Revenues (Worldscope datatypes WC01001 or DWSL) could be used as the measure of market share for
example, and the LIST# function can be used with an industry constituent list to calculate the cross-sectional sum.

It must be calculated in two stages, as multiple LIST# functions cannot currently be combined in a single expression. Also note
that the LIST# function can only be used in time series requests, not static requests. For further information about the LIST#
function, see the "Statistical" folder of the Function help and definition pages.

For example, using the constituent list for the Datastream index of US Internet companies LINTNTUS:

LIST#(LINTNTUS,X(DWSL),SUM)

gives the industry sum of net revenues in thousands of US dollars, e.g. 176560971.0

Then:

LIST#(LINTNTUS,POW#(X(DWSL)/176560971.0,2)*1.000,SUM)

calculates the Herfindahl measure, 0.34 in this example i.e. 34% (at the time of writing).
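
The underlying two-step arithmetic - market shares squared and summed across the industry - can be sketched in a few lines of Python (illustrative figures):

import pandas as pd

def herfindahl(sales: pd.Series) -> float:
    # Sum of squared market shares across the firms in an industry
    shares = sales / sales.sum()
    return float((shares ** 2).sum())

# Four firms with sales of 50, 30, 15 and 5 give an HHI of 0.365, i.e. 36.5%
print(herfindahl(pd.Series([50.0, 30.0, 15.0, 5.0])))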

Implied volatility

Implied volatility is available for instruments which have traded options. It shows the market expectation of annualised volatility
over the life of the option contract, as opposed to standard historical volatility of returns.

Search 'volatility' within the Options category of the Datatypes section of Navigator, to view the available datatypes.

For example:

ESXC.SERIESC(VI)

This returns the implied volatility of the FTSE 100 index, based on call options and the near-term contract.

K and D Stochastics

The stochastic oscillator is a technical analysis oscillator (or two oscillators) showing the latest closing price in relation to the
trading range of the past N days.

Two oscillator lines are calculated, called %K and %D, each ranging from 0 to 100. %K measures where the latest closing price
lies within the trading range of the past N days, from 0 when the latest close is a new N-day low up to 100 for a new N-day high. %D is a simple
moving average of %K. The "N" value can be varied - in our global expressions below we have used 5 days (5D) for %K, and 3
weeks (3W) for the %D moving average.

There is also a concept of "fast" and "slow" stochastics for this indicator. The "slow" concept applies an additional moving
average (e.g. 3 weeks again, in our global expressions below) compared to the "fast" concept, and is hence regarded as being
less sensitive to changes in the price, and less likely to generate transaction signals.

Levels near the extremes 100 and 0, for either %K or %D, indicate strength or weakness (respectively) with prices making or
approaching new N-day highs or lows. Levels above 80 and below 20 can therefore be interpreted as overbought or oversold.
The %D indicator acts as a trigger or signal line for %K. A buy signal is given when %K crosses up through %D, or a sell signal
when it crosses down through %D.

071E "STOCHASTICS %K (FAST)"

((X-MIN#(X(PL),5D))/(MAX#(X(PH),5D)-MIN#(X(PL),5D)))*100

055E "STOCHASTICS %K (SLOW)"


MAV#(((X-MIN#(X(PL),5D))/(MAX#(X(PH),5D)-MIN#(X(PL),5D))*100),3W)

072E "STOCHASTICS %D (FAST)"

MAV#(((X-MIN#(X(PL),5D))/(MAX#(X(PH),5D)-MIN#(X(PL),5D)))*100,3W)

056E "STOCHASTICS %D (SLOW)"

MAV#(MAV#(((X-MIN#(X(PL),5D))/(MAX#(X(PH),5D)-MIN#(X(PL),5D))*100),3W),3W)
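
For reference, a minimal pandas sketch of the fast %K and %D lines, assuming daily close, high and low Series and using a 15 trading day window (roughly 3 weeks) for the %D smoothing:

import pandas as pd

def stochastics(close: pd.Series, high: pd.Series, low: pd.Series,
                k_window: int = 5, d_window: int = 15) -> pd.DataFrame:
    lowest_low = low.rolling(k_window).min()        # MIN#(X(PL),5D) equivalent
    highest_high = high.rolling(k_window).max()     # MAX#(X(PH),5D) equivalent
    percent_k = (close - lowest_low) / (highest_high - lowest_low) * 100
    percent_d = percent_k.rolling(d_window).mean()  # simple moving average of %K
    return pd.DataFrame({"%K": percent_k, "%D": percent_d})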

See the following folder in the Expression Picker, where these expressions and other Technical Analysis indicators can be
found:

Datastream > Technical Indicators

In addition, Stochastics examples are available in Datastream Charting - see the following sections of the Library:

Charts > Datastream > Sample Charts > Technical Charts

and

Templates > Datastream > Equities > Technical

Similarly, a template is available in the Chart Studies app in EIKON. See the Datastream > Equities > Technical folder.

MACD

The Moving Average Convergence/Divergence indicator (MACD) is calculated by subtracting the value of a longer term
(normally 26 days) exponential weighted average from a shorter term (normally 12 days) exponential weighted average.

A trigger (or signal) line is then typically calculated using a 9 day exponential weighted average of the MACD.

The basic MACD trading rule is to sell when the MACD falls below its trigger line, or conversely buy when it rises above.

053E "MACD"

EWA#(X,0.154)-EWA#(X,0.074)

054E "MACD - TRIGGER"

EWA#(053E,0.20)

For further information on the EWA# function, see the entry for Exponential Weighted Average.

The above global expressions can be found in the following folder of the Expression Picker, along with a variety of other
Technical Analysis indicators:

Datastream > Technical Indicators

In addition, MACD examples are available in Datastream Charting - see the following sections of the Library:

Charts > Datastream > Sample Charts > Technical Charts

and

Templates > Datastream > Equities > Technical

Similarly, a template is available in the Chart Studies app in EIKON, and the Studies page from the EQUITY menu
in Datastream Professional. See the Datastream > Equities > Technical folder.
Maximum Drawdown and the Underwater measure

Underwater measure - this calculates the return/loss from selling on a particular date, after buying at the highest peak during the
time period. This can be calculated, in a time series chart or request, as (X/MAXG#(X)-1)*100.00 at, for example, daily data
frequency.

Maximum Drawdown calculates the maximum loss that would be suffered from an investment when buying at the highest peak
during a particular time period, and selling at the lowest subsequent trough in the same time period (i.e. the investment is
bought before it is sold). The expression at daily data frequency could be MIN#((X/MAXG#(X)-1)*100.00)
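
In Python terms, both measures follow directly from a running maximum - a minimal pandas sketch, assuming a daily price Series named price:

import pandas as pd

def underwater(price: pd.Series) -> pd.Series:
    # Percentage loss versus the highest value seen so far (cf. MAXG#)
    running_peak = price.cummax()
    return (price / running_peak - 1.0) * 100.0

def max_drawdown(price: pd.Series) -> float:
    # Worst peak-to-subsequent-trough loss over the whole period, in percent
    return float(underwater(price).min())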

On balance volume

On Balance Volume is a momentum indicator that measures positive and negative volume flow. If today’s close is greater than
yesterday’s close, then today’s volume is added to yesterday’s OBV; if today’s close is less than yesterday’s close, today’s
volume is subtracted. A rise in this indicator can be seen as confirmation that the price is in an uptrend.

058E "ON BALANCE VOLUME (DAILY)"

CSUM#((ACH#(X,1D))/ABS#(ACH#(X,1D))*X(VO))
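
The ACH#/ABS# term in this expression simply extracts the sign (+1 or -1) of the daily price change. A minimal pandas sketch of the same logic, assuming daily close and volume Series:

import numpy as np
import pandas as pd

def on_balance_volume(close: pd.Series, volume: pd.Series) -> pd.Series:
    direction = np.sign(close.diff())      # +1 on an up day, -1 on a down day, 0 if unchanged
    return (direction * volume).cumsum()   # running total of signed volume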

Oscillator

The Oscillator indicator is applicable to series where high/low/opening/close prices are available.

The formula is:

(X(PH)-X(PL)+X-X(PO))/((X(PH)-X(PL))*2)

The oscillator value varies between 0 and 1 and is a measurement of sensitivity. A value above 0.8 is generally regarded as
bullish and below 0.2 as bearish.

PEG ratio

The basic formula is

Price/Earnings Ratio / Annualised Earnings Growth (%)

e.g. if P/E = 20 and annualised earnings growth = 16 % then PEG Ratio = 1.25

A lower ratio is "better" (cheaper) and a higher ratio is "worse" (expensive). The P/E ratio used in the calculation may be
projected or trailing, and the annualised earnings growth rate may also be projected or trailing. This latter variable is normally
measured over a range of either 1 or 5 years.

PEG is a widely employed indicator of a stock's possible true value. Similar to P/E ratios, a lower PEG Ratio indicates that the
stock may be undervalued. It is sometimes preferred over the standard price/earnings ratio because it also accounts for growth.

A PEG ratio of 1 is sometimes said to represent a fair trade-off between the values of cost and the values of growth, indicating
that a stock is reasonably valued given the expected growth.

435E "PEG RATIO" (using FY1 and FY2 consensus I/B/E/S estimates of EPS)

X/(X(EPS2)-X(EPS1))/100

436E "PEG RATIO" (using 5 year historical growth of Worldscope earnings per share item WC05201)

X(PE)/((POW#(X(WC05201)/LAG#(X(WC05201),5Y),0.2)-1)*100)

460E "PEG RATIO HISTORIC" (using 5 year growth of historical EPS)


X(PE)/GRFL#(X(EPS),5Y)

Performance relative to benchmark

Performance is normally calculated as the % change, using the PCH# function. To calculate performance relative to a particular
benchmark, the following expression can be used - in this case a rolling 1 month performance relative to the S&P 500 index, for
example:

PCH#(X/S&PCOMP,1M)

Alternatively, rather than directly specifying an index, you can choose to automatically select the Local Index for the stock,
depending on the market it trades within:

PCH#(X/X(LI),1M)

Some users prefer to use differences instead of ratios: PCH#(X,1M)-PCH#(X(LI),1M).

To identify the index being referenced by LI for a particular stock, use datatypes LI#NAME and/or LI#MNEM in a Static Request
to return the name and mnemonic of the index, respectively.

Other benchmarks can also be automatically referenced, via a set of similar datatypes instead of LI:

MI Datastream Market Index

FII Datastream Industry Index

FEI Datastream Super Sector Index

FSI Datastream Sector Index

SEI Datastream Subsector Index

DSI Datastream Sector (level 6) index

TRI1L Thomson Reuters Economic Sector Index

TRI2L Thomson Reuters Business Sector Index

TRI3L Thomson Reuters Industry Group Index

TRI4L Thomson Reuters Industry Index

R square

Also known as the Coefficient of Determination, R2 (R squared) is a statistic often used as a measure of fit of a linear
regression. It is popularly described as the proportion of the variation in the dependent variable (around its mean value) that can
be explained by the linear regression.

In a bivariate regression, the statistic is calculated as the square of the correlation coefficient and it ranges from 0 to 1. A value
close to one indicates that data fit the calculated model closely; a value close to zero indicates that data do not fit the model
well.

For example, when the 30 day rolling correlation between a stock and its benchmark is calculated as:

CORR#(PCH#(FTALLSH,1D),PCH#(MKS,1D),30D)

Then R2 is calculated as:

POW#(CORR#(PCH#(FTALLSH,1D),PCH#(MKS,1D),30D),2)
Rebasing

Rebasing adjusts time series values to begin at a new starting value, typically 100.

The REB# function rebases an individual series; the REBE# function rebases expressions. For more details and examples of
these functions, see the Rebase page within the "Display" folder of the Function help and definitions pages.

For example:

REB#(VOD,-1M) rebases the Vodafone price to 100, starting on a date 1 month in the past.

REBE#(VOD/FTSE100,-1M) similarly rebases Vodafone performance relative to the FTSE 100 benchmark.

To “rebase” a series to zero rather than 100 to calculate performance, use the PCHV# function instead - see the "Growth rates
& changes in value" section of the Function help and definitions pages mentioned above.

For example:

PCHV#(VOD)

rebases Vodafone to zero, starting on the earliest date displayed in a chart or time series request.

To rebase a series to a starting value of your choice, use the VAL# function (see the "Display" folder of the Functions help and
definitions). For example, to rebase Vodafone to 50 on a date 3 months in the past, use the expression:

VOD/VAL#(VOD,-3M)*50
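
The equivalent arithmetic in pandas, assuming a price Series and a base date present in its index:

import pandas as pd

def rebase(price: pd.Series, base_date, base_value: float = 100.0) -> pd.Series:
    # Scale the series so that it equals base_value on base_date
    return price / price.loc[base_date] * base_value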

Also note that Datastream Charting contains Rebase settings which do not require the use of functions such as the above - click
the Left Axis on a chart to open the settings, and choose from "To First", "To 100" or "To Zero".

Regression standard error and t-value

The beta coefficient can be estimated using global expression 897E for example (see the Beta entry in this document).

897E "DS HISTORICAL BETA LOCAL INDEX"

REGB#(LN#(X(LI)/LAG#(X(LI),1M)),LN#(X/LAG#(X,1M)),60M)

The following expressions for the standard error and t-value assume that there are no missing values among the 60 monthly
observations. The standard error of the REGB coefficient is then given by the formula:

POW#((1-POW#(841E(X),2))*MOM#(LN#(X/LAG#(X,1M)),60M,2)*60/58,0.5)

divided by:

SDN1#(LN#(X(LI)/LAG#(X(LI),1M)),60M)*POW#(59,0.5)

The POW# factor at the end of the second part of the expression above is the square root of the number of observations (60)
minus 1, i.e. the square root of 59.

The overall calculation is available as global expression 096E "STANDARD ERROR OF REGB".

The t value is the REGB value divided by the standard error:

097E "T VALUE OF REGB COEFFICIENT"

841E(X)/POW#(1-POW#(841E(X),2),0.5)*POW#(58,0.5)
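
As a cross-check outside Datastream, the slope, its standard error and the t value can be read directly from a standard regression routine - a sketch using scipy, assuming two aligned monthly price arrays:

import numpy as np
from scipy import stats

def beta_standard_error_and_t(stock_prices: np.ndarray, index_prices: np.ndarray):
    stock_returns = np.diff(np.log(stock_prices))      # monthly log returns of the stock
    index_returns = np.diff(np.log(index_prices))      # monthly log returns of the local index
    result = stats.linregress(index_returns, stock_returns)
    t_value = result.slope / result.stderr             # REGB value divided by its standard error
    return result.slope, result.stderr, t_value
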
Relative strength index

A technical momentum indicator that compares the magnitude of recent gains to recent losses, in an attempt to determine
overbought and oversold conditions of an asset. It is calculated using the following formula:

RSI = 100 - 100 /(1 + RS)

Where RS = average of N days' up closes / average of N days' down closes. N is specified as a time parameter in the RSI#
function, e.g. N = 14D. See the "Technical" folder of the Function definition help pages for more information on this function.

Using the FTSE 100 index as an example, over a 14 day time period:

RSI#(FTSE100,14D)

The RSI ranges from 0 to 100. An asset is deemed to be overbought once the RSI approaches the 70 level, meaning that it may
be getting overvalued and is a good candidate for a pullback. Likewise, if the RSI approaches 30, it is an indication that the
asset may be getting oversold and therefore likely to become undervalued.
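
A minimal pandas sketch of the formula above over N days, assuming a daily closing price Series (note that the RSI# function itself may use a different averaging convention, such as Wilder smoothing):

import pandas as pd

def rsi(close: pd.Series, n: int = 14) -> pd.Series:
    change = close.diff()
    average_gain = change.clip(lower=0).rolling(n).mean()      # average of up closes
    average_loss = (-change.clip(upper=0)).rolling(n).mean()   # average of down closes, as a positive number
    rs = average_gain / average_loss
    return 100 - 100 / (1 + rs)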

A further example is available in Datastream Charting - see the following sections of the Library:

Charts > Datastream > Sample Charts > Technical Charts

and

Templates > Datastream > Equities > Technical

Similarly, a template is available in the Chart Studies app in EIKON.

Revisions ratio I/B/E/S

This represents the net revisions made by brokers contributing to I/B/E/S consensus estimates.

There are a number of global expressions available within the following section of the Expression Picker:

Datastream > Ratios > Estimates Ratios

For example:

138E "REVISIONS RATIO (STOCK # ESTs)"

((X(F1UP)-X(F1DN))/X(F1NE))*100

429E "REVISIONS RATIO (INDEX # ESTs)"

((X(A12UPE)-X(A12DNE))/X(A12NE))*100

469E "REVISIONS RATIO (INDEX # COs)"

((X(A12UPC)-X(A12DNC))/X(A12NC))*100

497E "REVISIONS RATIO (INDEX E U/D)"

DPL#(X(A12UPE)/X(A12DNE),2)
Risk premium

The equity risk premium is defined as the total annualised return on equities minus the total annualised return on a safe
government backed instrument such as treasury bills or government bonds (generally short term). Normally a long time period is
used, e.g. the last 10 years.

There are 4 Datastream global expressions available to measure the risk premium as follows. They use the GRFL# function,
which calculates the annualised rate of growth.

180E "RISK PREMIUM UK MEASURE"

GRFL#(TOTMKUK(RI),10Y)-GRFL#(AUKGVG1(RI),10Y)

181E "RISK PREMIUM US MEASURE"

GRFL#(TOTMKUS(RI),10Y)-GRFL#(AUSGVG1(RI),10Y)

182E "RISK PREMIUM JAPAN MEASURE"

GRFL#(TOTMKJP(RI),10Y)-GRFL#(AJPGVG1(RI),10Y)

183E "RISK PREMIUM GERMAN MEASURE"

GRFL#(TOTMKBD(RI),10Y)-GRFL#(ABDGVG1(RI),10Y)

Equity risk premiums for other markets can be calculated by using the appropriate equity total return index and bond total return
index. Some users prefer to use short term government treasury bill yields, or similar, instead of government bond redemption
yields.

NB: The equity risk premium should generally be a positive number, as it represents the extra risk of holding equities compared
with 'safe' government bonds, which are conventionally considered to be 'risk free' instruments. However if a time period is
chosen which includes a significant bear market period(s), for example 1999 to 2008, the reading will almost certainly be
negative. If a measure of the expected (rather than realised) risk premium is wanted, it is suggested that measurements are
generally taken from a relatively low point to a relatively high point, in order to represent 'normal' market conditions.

Sharpe ratio

This is the total return divided by volatility, i.e. the rate of return per unit of risk. The theory is that in general, in order to obtain a
higher return you have to take on additional risk (i.e. extra volatility).

So the higher the Sharpe ratio, the better - i.e. you get a higher return for a given level of risk.

A number of global expressions are available here. A popular expression is 445E which is based on a 3 year period using
weekly data:

445E "SHARPE RATIO, 3 YEARS WKLY"

GRFL#(X(RI),-156W,)/(SDN1#(PCH#(X(RI),1W),156W)*7.211)

Here the numerator is the annualised total return, and the denominator is the annualised volatility (the factor 7.211 is the square root of 52, which annualises the weekly standard deviation).

The generalised form of the ratio is:

330E "SHARPE RATIO, GENERAL"

MAV#(PCH#(X,A)-PCH#(Y,A),Z)/SDN1#(PCH#(X,A)-PCH#(Y,A),Z)

where X is the stock/instrument (e.g. @MSFT), Y is a benchmark (e.g. S&PCOMP), A is the return measurement interval (e.g.
1D) and Z the time period for the moving average (e.g. 3M). This example can be entered in a Time Series request or chart as:
330E(@MSFT,S&PCOMP,3M,1D)
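
A pandas sketch of the generalised form - a rolling mean of excess returns divided by their rolling (sample) standard deviation - assuming two aligned daily price Series and a window of about three months of trading days:

import pandas as pd

def rolling_sharpe(asset: pd.Series, benchmark: pd.Series, window: int = 63) -> pd.Series:
    # Excess return each period, cf. PCH#(X,A)-PCH#(Y,A)
    excess = asset.pct_change() - benchmark.pct_change()
    # Rolling mean over the window divided by the rolling sample standard deviation (cf. SDN1#)
    return excess.rolling(window).mean() / excess.rolling(window).std(ddof=1)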

Shiller PE ratio or CAPE

Named after the economist Robert Shiller, the Shiller PE - also known as Cyclically Adjusted Price Earnings (CAPE) - is an
extension of the Graham & Dodd PE - the difference being that Shiller PE adjusts for inflation.

Shiller PE traditionally relates specifically to the S&P 500 index for the US market - historical values are available back to 1950
under mnemonic USSPRPER. The measure can also be calculated and/or adapted for other markets however, using
Datastream expressions.

For the US market, mnemonics USCONPRCF (US CPI) and S&PCOMP are used in the Datastream global expressions 532E
and 533E below. At the time of writing, freely available S&PCOMP(PE) data coverage ends in October 2012 due to S&P
licensing restrictions, which impacts expression 532E. However, the Datastream calculated datatype DSPE is available as an
alternative to the PE datatype, and this is used as an alternative in 533E, since data are available up to present day.

532E "SHILLER PE RATIO"

(S&PCOMP/USCONPRCF)/(MAV#((S&PCOMP/S&PCOMP(PE))/USCONPRCF,10Y))

533E "SHILLER PE RATIO (DSPE)"

(S&PCOMP/USCONPRCF)/(MAV#((S&PCOMP/S&PCOMP(DSPE))/USCONPRCF,10Y))

You can adapt these standard expressions for other markets, using the relevant market index and CPI series.

For example, for the European Union region you could replace USCONPRCF with an alternative such as EHESAW52F (HICP
ALL-ITEMS), sourced from Eurostat, and DJEURST(PE) i.e. the EURO STOXX index PE as the equity market measure:

(DJEURST/EHESAW52F)/(MAV#((DJEURST/DJEURST(PE))/EHESAW52F,10Y,I))

Tobin’s Q

Named after the economist James Tobin, this indicator is based on his hypothesis that the total value of a company should be
approximately equal to its replacement costs. The Q ratio is calculated as the market (or enterprise) value of a company divided
by the replacement value of its assets.

A low Q (between 0 and 1) indicates that the company's cost to replace assets is greater than the value of its stock; this implies
that the stock is undervalued. Conversely, a high Q (greater than 1) indicates that the company's stock is more expensive than
the replacement cost of its assets; this in turn implies that the stock is overvalued. This measure of stock valuation is the driving
factor behind investment decisions in Tobin's model.

There are a number of variations in the specific calculation of company value and asset replacement value in the financial
literature; below are two possible approaches which are available as global expressions:

168E "TOBIN'S Q"

(X(MVC)*1000.000+PAD#(X(WC03451)~PCUR,C)+PAD#(X(WC03251)~PCUR,C)+PAD#(X(WC03051)~PCUR,C))/PAD#(X(WC02999)~PCUR,C)

170E "TOBIN'S Q VER. 2"

X(DWEV)/X(DWTA)*1.000

Toraku index

This is a market breadth indicator, calculated as the average number of gainers (advances) over a 25 day period, divided by the
average number of losers (declines) over the same period, multiplied by 100.
The Toraku index specifically refers to stocks on the Tokyo Stock Exchange, but the same methodology could be applied for
other markets too, by replacing TOKYOSE with an alternative index mnemonic (dependent on the RS and FS datatypes being
available - if they are not, a User Created Index could be calculated instead).

(MAV#(TOKYOSE(RS),25D,R)/MAV#(TOKYOSE(FS),25D,R))*100.00

Also see the related entry on the Advance Decline Ratio and Line.

Volatility

Volatility is a statistical measure defined as the annualised standard deviation of log price changes, and measures the range of
variation over time.

There are several global expressions available on Datastream for commonly used time periods and frequencies - they can be
found in the following section of the Expression Picker:

Datastream > Risk and Volatility > Volatility

These include the below expressions, for example:

840E "HISTORICAL VOLATILITY 30 DAYS"

POW#(250,0.5)*SDN#(LN#(X/LAG#(X,1D)),30D)*1.000

431E "HISTORICAL VOLATILITY 1 YEAR"

POW#(250,0.5)*SDN#(LN#(X/LAG#(X,1D)),250D)*1.0000

191E "HISTORICAL VOLATILITY 3 YEARS"

POW#(52,0.5)*SDN#(LN#(X/LAG#(X,1W)),156W)*1.0000

400E "HISTORICAL VOLATILITY 5 YEARS"

POW#(12,0.5)*SDN#(LN#(X/LAG#(X,1M)),60M)*1.0000

Note that for longer time periods, the frequency of observations is usually dropped - so in the above expressions, the 30D (day)
time period for the volatility calculation uses daily frequency price changes (1D); the 3 year time period (156 weeks) uses
weekly frequency price changes (1W); and the 5 year time period (60 months) uses monthly frequency price changes (1M).

Each of the expressions uses the POW# (Power) function to calculate the annualising factor, which for volatility is the square
root of the number of observations in a year. Daily frequency expressions use the annualising factor POW#(250,0.5) as there
are assumed to be 250 trading days in a year; weekly frequency values use POW#(52,0.5) as there are 52 weeks in a year, and
monthly frequency values use POW#(12,0.5) as there are 12 months in a year.

SDN# is the function for measuring standard deviation; LN# is the Natural Log function, used here to calculate the log price
change between the latest close (X) and the previous (i.e. lagged) value via LAG#.

Frequency and time periods can be varied according to your preferences within the general format of the global expressions
above. For example, to measure volatility over a 100 day period, 840E could be altered to:

POW#(250,0.5)*SDN#(LN#(X/LAG#(X,1D)),100D)

Using one of the above expressions with any financial instrument, the resulting data represents the annualised 1 standard
deviation range of the movement. For example, a volatility reading of 0.12 means that the 1 standard deviation range is +/-
12%.

Note: in the above examples, the function SDN# is used, known as the ‘population’ standard deviation. An alternative is the
function SDN1#, known as the ‘sample’ standard deviation, and this gives a very slightly larger answer because the number of
observations divisor is n-1 instead of n observations.
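
A minimal pandas sketch of the 30 day measure above - the population standard deviation of daily log price changes, annualised with the square root of 250 - assuming a daily price Series named close:

import numpy as np
import pandas as pd

def historical_volatility(close: pd.Series, window: int = 30, periods_per_year: int = 250) -> pd.Series:
    log_change = np.log(close / close.shift(1))           # LN#(X/LAG#(X,1D)) equivalent
    rolling_sd = log_change.rolling(window).std(ddof=0)   # SDN#: population standard deviation
    return rolling_sd * np.sqrt(periods_per_year)         # annualising factor, cf. POW#(250,0.5)
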
Also see the related entry for Implied Volatility.

Volume weighted average price (VWAP)

Volume Weighted Average Price (VWAP) is calculated by dividing the total value of shares traded for a stock on a particular day
by the total volume of shares traded for the stock on that day. It is available under equity datatype VWAP.

The calculation uses data from the Datastream primary exchange for the stock – see the EXNAME equity datatype to check
which exchange this is.

VWAP = Sum of (Volume * Price) / Sum of Volume

In the above formula, "sum" refers to the aggregate of the day's transactions on the primary exchange.

Note that for Italian stocks VWAP represents the Prezzo Ufficiale price.

Weighted average cost of capital (WACC)

WACC, expressed as a percentage, is a calculation of a company’s cost of capital in which each category of capital (equity
and debt) is weighted proportionately. Investors interpret it as the rate of interest that the company must effectively pay on its
overall financing; the company can use the figure as an opportunity cost estimate when determining whether future projects are worth undertaking.

The calculation requires a number of inputs, defined as:

Stock: X

Risk Premium: Y

Risk Free Rate: Z

Corporate Tax Rate: A

Beta: e.g. 897E

Total Debt: PAD#(X(WC03255),C)

Market Value of Equity: X(MV)~ACUR

Then:

557E "WACC"

((897E(X)*Y+Z)*X(MV)~ACUR+(1-A/100)*(Z+1)*PAD#(X(WC03255),C)/1000)/(X(MV)~ACUR+PAD#(X(WC03255),C)/1000)

Note: the first part of this formula is the cost of raising equity, which is essentially the risk free interest rate plus the equity risk
premium (defined as the market risk premium multiplied by Beta - see the entry for Beta above). The second part of the formula
is the cost of borrowing money, where the interest rate for a blue chip company is here assumed to be 1% above the risk free
rate (i.e. Z+1), and interest payments are tax deductible; for lower rated companies, a figure higher than 1 should be used to
reflect a higher cost of borrowing. The two parts are weighted by their respective market values of equity and debt.

X,Y,Z,A can be selected as for example:

X = GSK (GlaxoSmithKline)

Y = 180E for UK risk premium - see the entry on Risk premium above for more explanation.

Z = BMUK10Y(RY) for the UK 10 year government bond yield as the risk free rate.

A = 20% UK corporate tax rate as of 2015.


And the expression is then used in this case as:

557E(GSK,180E,BMUK10Y(RY),20)
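
To make the weighting explicit, here is a plain Python sketch of the same formula with illustrative input names; the 1% spread over the risk free rate for the cost of debt follows the assumption described in the note above.

def wacc_percent(beta: float, risk_premium: float, risk_free: float,
                 tax_rate_pct: float, equity_value: float, total_debt: float) -> float:
    # All rates are in percent; equity_value and total_debt must be in the same currency and units
    cost_of_equity = beta * risk_premium + risk_free                 # cf. 897E(X)*Y+Z
    cost_of_debt = (risk_free + 1.0) * (1 - tax_rate_pct / 100.0)    # assumed 1% over risk free, after tax
    total_capital = equity_value + total_debt
    return (cost_of_equity * equity_value + cost_of_debt * total_debt) / total_capital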

Williams %R

Williams %R is a momentum indicator that works much like the Stochastic Oscillator, see entry above. It is especially popular for
measuring overbought and oversold levels. The scale ranges from 0 to -100, with readings from 0 to -20 considered overbought,
and readings from -80 to -100 considered oversold.

Williams %R, sometimes referred to as %R, shows the relationship of the close relative to the high-low range over a set period of
time. The nearer the close is to the top of the range, the nearer to zero (higher) the indicator will be. The nearer the close is to
the bottom of the range, the nearer to -100 (lower) the indicator will be. If the close equals the high of the high-low range, then
the indicator will show 0 (the highest reading). If the close equals the low of the high-low range, then the result will be -100 (the
lowest reading).

Typically, Williams %R is calculated using 14 periods and can be used on intraday, daily, weekly or monthly data. The time
frame and number of periods will likely vary according to desired sensitivity and the characteristics of the individual security.

531E "WILLIAMS % R 14 DAYS"

(MAX#(X(PH),14D)-X(P))/(MAX#(X(PH),14D)-MIN#(X(PL),14D))*(-100)
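For example, applied to Vodafone:

531E(VOD)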

Williams variable accumulation

Williams' Variable Accumulation Distribution provides a measure of volume, weighted by price momentum. For each period it takes the current period’s volume, multiplied by the difference between the close and the open, divided by the period's range; the indicator is then the cumulative sum of this value over time. A close above the open produces a positive contribution, and a close below the open produces a negative contribution. If the close and open are always within the high/low range, then each period's contribution will lie between the current period's volume and the negative of the current period's volume.

507E "WILLIAMS VARIABLE ACCUMULATION"

CSUM#((X-X(PO))/(X(PH)-X(PL))*X(VO))
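As with the other stored expressions, it takes a single equity as input - for example:

507E(VOD)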

Z-score (Altman)

This is a score devised by Edward Altman to describe the financial health of a company and its likelihood of financial distress. The Altman Z-Score works best with manufacturing companies; it is not so well suited to financial or property companies.

Given:

TA - Total Assets

TL - Total Liabilities

WC - Working Capital (i.e. stocks and debtors less short term creditors)
RE - Accumulated Retained Earnings (the profit and loss account reserve)

EBIT - Earnings Before Interest and Tax

S - Sales

MV - Market Value of equity

And:

X1 = WC/TA : This measures liquid assets in relation to the firm's size. Altman, interestingly, notes that the widely used current and acid-test ratios were not as good predictors as this measure.

X2 = RE/TA : This is a measure of cumulative profitability that reflects the firm's age as well as earning power. Many studies
have shown failure rates to be closely related to the age of the business.

X3 = EBIT/TA : This is a measure of operating efficiency separated from any leverage effects. It recognizes operating earnings
as a key to long-run viability.

X4 = MV/TL : This ratio adds a market dimension. Academic studies of stock markets suggest that security price changes may
foreshadow upcoming problems.

X5 = S/TA : This is a standard turnover measure. Unfortunately, it varies greatly from one industry to another.

Then the Z-score for manufacturing companies is defined as:

Z = (1.2 * X1) + (1.4 * X2) + (3.3 * X3) + ( 0.6 * X4) + ( 1.0 * X5)

Z < 1.8 indicates a bankruptcy candidate, Z > 3 indicates strong health.
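As a purely illustrative calculation (hypothetical ratios, not Datastream output): with X1 = 0.2, X2 = 0.3, X3 = 0.15, X4 = 1.5 and X5 = 1.0, Z = 1.2*0.2 + 1.4*0.3 + 3.3*0.15 + 0.6*1.5 + 1.0*1.0 = 3.055, which is above 3 and so indicates strong health.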

For non-manufacturing companies, X5 varies significantly by industry. It is likely to be higher for merchandising and service
firms than for manufacturers, since the former are typically less capital intensive. Consequently, non-manufacturers would have
significantly higher asset turnover and Z scores. The model is thus likely to underpredict certain sorts of bankruptcy. To correct
for this potential defect, Altman recommends the following formula:

Z = (6.56 * X1) + (3.26 * X2) + (6.72 * X3) + (1.05 * X4)

Z < 1.1 indicates a bankruptcy candidate, Z > 2.6 indicates financial health.

Two global expressions are available in the Expression Picker, folder:

Datastream > Fundamentals > Altman Z-Score:

374E "ALTMAN Z-SCORE MANUFACTURING"

HIDE#(PAD#((X(WC03151)*1.2+X(WC03495)*1.4+X(WC18191)*3.3+X(WC01001))/(X(WC02999)-X(WC02649))+X(WC08001)/X(WC03351)*0.6,C))

375E "ALTMAN Z-SCORE NON-MANUFACTURING"

HIDE#(PAD#((X(WC03151)*6.56+X(WC03495)*3.26+X(WC18191)*6.72)/(X(WC02999)-X(WC02649))+X(WC03501)/X(WC03351)*1.05,C))

Note -

Total Tangible Assets are used: WC02999 (Total Assets) less WC02649 (Total Intangible Other Assets Net).
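As with the other stored expressions, these take a single equity as input - for example (purely as an illustration, using a mnemonic that appears elsewhere in this document):

374E(GSK)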

Both forms of the Altman Z Score are available in Datastream Charting - see the following sections of the Library:

Templates > Datastream > Fundamentals > Altman Z-Score


Similarly, a template is available in the Chart Studies app in EIKON.

Z-score (Statistical)

This is a statistical measurement used for normalizing data. A Z-score of 0 indicates the value is equal to the mean value over
the given time period; a Z-score that is positive or negative indicates that the value is above or below the mean (respectively)
and by how many standard deviations.

A generalised global expression is available, for calculating the Z-Score over a rolling time period:

300E "ROLLING NORMALIZATION"

(X-MAV#(X,Y))/SDN1#(X,Y)

Where:

X = any time series instrument, e.g. FTSE100

Y = time period, e.g. 30D (30 days)

This example would be entered in a Time Series request or chart as:

300E(FTSE100,30D)

Alternatively, the following can be used, which uses the start and end date of the Time Series request or chart as the respective
start and end points of the Average and (sample) Standard Deviation components of the Z-Score calculation:

315E "NORMALIZATION"

(X-AVG#(X))/SDN1#(X)

For example:

315E(FTSE100)
Other helpful hints for Datastream expressions and functions
Percentage of stocks trading above/below a high/low or moving average

A common question is how to calculate the percentage of stocks within an index that are trading above/below a moving average, or above/below a 52 week high/low, for example.

The LIST# function can be used to achieve this.

The following calculates the % of stocks within the Nikkei 225 index* that hit a new 52 week high each day:

LIST#(LJAPDOWA,(IF#(MAX#(LJAPDOWA,52W)-LAG#(MAX#(LJAPDOWA,52W),1D),GT,ZERO))*100,AVG)

Conversely, the following calculates the % that hit a new 52 week low each day:

LIST#(LJAPDOWA,(IF#(MIN#(LJAPDOWA,52W)-LAG#(MIN#(LJAPDOWA,52W),1D),LT,ZERO))*100,AVG)

The LJAPDOWA list and 52W (week) time period could be switched to alternatives if preferred.

Similarly, the same type of expression can be used to calculate the % of constituents trading above the 50 day moving average,
for example:

LIST#(LJAPDOWA,(IF#(LJAPDOWA-MAV#(LJAPDOWA,50D),GT,ZERO))*100,AVG)
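Following the same pattern, the list and moving average period can simply be swapped - for example, for the latest FTSE 100 constituent list against a 200 day moving average:

LIST#(LFTSE100,(IF#(LFTSE100-MAV#(LFTSE100,200D),GT,ZERO))*100,AVG)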

The following example chart is available in the Datastream Charting Library - in the following folder:

Chart Functionality Examples > Analytics > Functions and Expressions

*Note - the constituent list mnemonic used in the LIST# expression is fixed and does not allow for index constituents changing over time. In this example, LJAPDOWA is the latest (as of the date of the request) Nikkei 225 constituent list - and this is used for the calculation of all time series values, so caution should be exercised if requesting a long time series of values, spanning more than a year for example. Historical constituent lists are generally available for indices, however, by suffixing the month (end) and year to the mnemonic - e.g. suffixing 1214 to LJAPDOWA (i.e. LJAPDOWA1214) specifies the December 2014 month end list. So the LIST# expression could be run in time period sections, using the respective historical constituent list for each section, if greater accuracy is required.

Some of the IF# expressions above contain the operator LT. Missing values are treated as small negative values, so the IF# equals 1 for missing values. If that behaviour isn’t wanted, the expression should be reformulated to use GT instead (for example by reversing the order of the subtraction).

Dummy time series for use in expressions

A wide variety of 'dummy' time series are available, which are often very useful as components of calculations when
constructing complex and/or conditional expressions.

For example, they could be used in the following types of application:

• USCONPRCF/DUMMYMQ3 displays only the third month of each quarter for the US consumer price index.

• UKCONPRCF/DUMMYJAN displays only the January figures for the UK consumer price index.

• Growth rates can be produced by using the exponential time trends, for example:

• POW#(UKTIMEME,0.4066) displays a monthly series which grows at an annual rate of 5%.

• The factor 0.4066 in the above case is determined by the expression LN#(1+growth rate/100)/(number of periods in
year/100), so LN#(1.05)/0.12 = 0.4066

• Similarly POW#(UKTIMEQE,2.383) is a quarterly series growing at 10% annual rate, where LN#(1.1)/0.04 = 2.383

• Negative time trends:

• (588-UKTIMETM) is a monthly (linear) time trend starting at 587 in January 1951, counting down to zero in December 1999.

• 7.099327/UKTIMEQE starts at 7.099327 in the first quarter of 1951 and declines (exponentially) by EXP 0.01 each quarter to 1 in
the fourth quarter of 1999.

• POW#(7.09933/UKTIMEQE,1.814) displays a quarterly series which declines at an annual rate of 7%

The factor 1.814 = LN#(100/(100-7))/0.04 i.e. 7 is the negative growth rate and 0.04 is the number of time periods in year
divided by 100

The following trend time series are available:

CALNDAY increases by 1 each calendar day (starting 1 January 1990)

DUMMYDY increases by 1 each weekday (starting 1 January 1970)

UKTIMETM increases by 1 each month (starting January 1951)

UKTIMETQ increases by 1 each quarter (starting Q1 1951)

UKTIMETY increases by 1 each year (starting 1951)

UKTIMEME increases exponentially by EXP0.01 each month (starting January 1956)

UKTIMEQE increases exponentially by EXP0.01 each quarter (starting Q1 1951)

UKTIMEYE increases exponentially by EXP0.01 each year (starting 1956)

Annual frequency series

DUMMYYRE yearly end alignment series value = 1 (starting 1950)

DUMMYYRM yearly mid alignment series value = 1 (starting 1950)


Quarterly frequency series

DUMMYQTE quarterly end alignment series value = 1 (starting 1950)

DUMMYQTM quarterly mid alignment series value =1 (starting 1950)

DUMMYQT1 quarterly series containing value 1 in each 1st quarter; starting Q1 1951

DUMMYQT2 quarterly series containing value 1 in each 2nd quarter; starting Q2 1951

DUMMYQT3 quarterly series containing value 1 in each 3rd quarter; starting Q3 1951

DUMMYQT4 quarterly series containing value 1 in each 4th quarter; starting Q1 1951

DUMMYQTR2 quarterly series containing value 1 in each 2nd quarter; starting Q1 1950

DUMMYQTR3 quarterly series containing value 1 in each 3rd quarter

DUMMYQTR4 quarterly series containing value 1 in each 4th quarter

Monthly frequency series

DUMMYCAL monthly series showing number of calendar days in the month

DUMMYNAS monthly series where every value = 0

MONTHLY2 monthly series where value = 1 for January, 2 for February, 3 for March etc.

DUMMYMTE monthly end alignment series where every value = 1

DUMMYMTM monthly mid alignment series where every value = 1

DUMMYJAN monthly series where January =1, zero otherwise

Similarly: DUMMYFEB DUMMYMAR DUMMYAPR DUMMYMAY DUMMYJUN DUMMYJUL DUMMYAUG DUMMYSEP DUMMYOCT DUMMYNOV DUMMYDEC

DUMMYMQ1 first month of each quarter=1, zero otherwise

DUMMYMQ2 second month of each quarter=1, zero otherwise

DUMMYMQ3 third month of each quarter=1, zero otherwise

DUMMYMQ1N monthly series where January/April/July/October=1, n/a otherwise

DUMMYMQ2N monthly series where February/May/August/November=1, n/a otherwise

DUMMYMQ3N monthly series where March/June/September/December=1, n/a otherwise

Daily frequency series

DUMMY01 daily series where value=1 at end of month, zero otherwise, starting 26 December 1986

DUMMYDD daily series where value=1 every day, starting 28 January 1980

DUMMYMD daily series where value=1 on Mondays, starting January 1950

DUMMYTU daily series where value=1 on Tuesdays, starting January 1950

DUMMYWD daily series where value=1 on Wednesdays, starting January 1950


DUMMYTH daily series where value=1 on Thursdays, starting January 1950

DUMMYFR daily series where value=1 on Fridays, starting January 1950

DUMMDY1 daily series which counts up from 1 and resets each year, starting 1 January

Holidays

A list mnemonic HOLIDAYS exists, which contains a list of stored series codes that can be used to indicate the dates that
market holidays* fell on within each respective market, when used in a Time Series request. Non-holiday dates are returned as
zero; holidays are returned as null values. So for example, the code C:VACS shows which dates were market holidays in
Canada, when downloaded in a Time Series request.

To retrieve the list of Holiday series for each market, enter HOLIDAYS as the series in a Static request, along with datatypes
MNEM and GEOGN to return the mnemonic and country name respectively.

This is useful in particular when analysing history for instruments that trade infrequently - and you are looking to distinguish
between dates when they didn't trade because of a market holiday, versus dates when the market was open but the stock still
didn't trade.

You can also incorporate these Holiday series into expressions - for example the following uses CMS# (Calendar Month Sum) and IF# to count the number of Canadian holidays that occurred during each month.

CMS#(IF#(C:VACS,NNA,ZERO))

For further explanation and examples of using the CMS# and IF# functions, see the Calendar and Logical folders respectively of
the Functions Help and Definitions pages.

*Defined as dates on which the primary stock exchange was closed, due to public holiday or unusual circumstances such as
temporary trading suspensions or extreme weather events.

IF# function and conditional tests

This function is used to test a specific condition within a series or expression. It returns a numeric true/false result.
The format is:

IF#(Series or expression,Operator,Return value)

Where the operators are:

EQ Equal zero

NE Not equal zero

GT Greater than zero

GE Greater or equal zero

LT Less than zero

LE Less than or equal zero

NA Equal to NA

NNA Not equal to NA

And the return value parameter is one of:

ZERO returns 1 if true, 0 if false

ONE returns 1 if true, -1 if false

In order to achieve more flexibility with the output from these conditional tests, it is possible to combine a number of IF#
expressions together.

For example, the following expression returns -1 when EPS is zero, and actual EPS otherwise:

IF#(X(EPS),NE,ONE)*X(EPS)+(IF#(X(EPS),EQ,ZERO)*IF#(X(EPS),NE,ONE))
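As another simple illustration (using Vodafone, as elsewhere in this document), the following returns the price only on days when it is above its 200 day moving average, and zero otherwise:

IF#(VOD-MAV#(VOD,200D),GT,ZERO)*VOD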

Note that missing values are treated as small negative values, so the following always returns 1 (for true) if X is a missing value and Y is any value:

IF#(X-Y,LT,ZERO)

Index calculation from percentage change series

There is a global expression 099E = 100+(EXP#(CSUM#(LN#(1+X/100.00)))-1)*100. For example: 099E(BARC(MSGRD)).

This means that the index starts at 100 one observation prior to the request start date. An alternative expression, which starts at 100 on the request start date, is

REBE#(CPRD#(1+X/100)). For example: REBE#(CPRD#(1+(BARC(MSGRD)/100)))

Inflation (removal of)

To remove ('deflate') the effect of inflation from a series, for example to remove the effect of US inflation from the Dow Jones
Industrials index (DJINDUS), you could use the expression:

(DJINDUS/CSR#(USCONPRCF))*VAL#(CSR#(USCONPRCF),VLO)

This uses the USCONPRCF series for US CPI, with the CSR# (Continuous Series) function to transform the monthly CPI series
into daily frequency values, and the VAL#(...,VLO) function to select the CPI value at the start date of the chart, in order to
perform the rebase of values.
This is available generically as a global expression:

171E "INFLATION ADJUSTMENT"

(X/CSR#(Y))*VAL#(CSR#(Y),VLO)

171E(DJINDUS,USCONPRCF) is then equivalent to the above example.

Limits for expressions and AFO/DFO requests

Expression lengths using direct entry (i.e. when not using a saved, short expression code)

In DFO you can type in an expression with a maximum length of 100 characters (time series request) or 45 characters (static
request).

In Datastream Charting you can type in expressions with a maximum of 100 characters, except for Static Scatter charts, which
only accept 45 characters.

In Datastream Advance/AFO you can type in an expression with a maximum length of 78 characters (time series request) or 45
characters (static request).

DFO and Datastream Charting expression builder

An expression saved as a short code (E###) can contain a maximum of 160 characters.
Tips:

• You can combine expressions to get around the character limits e.g. expression C = expression A + expression B, each of which could
have up to 160 characters.
• You can use up to four symbolic variables (X,Y,Z,A) in your saved expression to make it shorter.
• Note:
• When you save the expression, it is validated and if it contains symbolic variables, then a value has to be supplied for those variables
and the expression is expanded to its full length. For any single expression this cannot exceed 160 characters.
• Expressions which contain other expression codes, and/or use symbolic variables, will be expanded to their full length when the request is processed by the central system. When the expression is expanded, it is bounded by brackets to ensure mathematical compliance. In practice this means that an expression stored with 160 characters will be expanded to 162 characters.
• The maximum expanded length (including any combined or nested expressions and any series codes/mnemonics replacing symbolic
variables) is 400 characters.

Datastream Advance/AFO expression builder

As above, except the following differences:

• An expression saved as a short code (E###) can contain a maximum of only 78 characters.

• If it contains symbolic variables, then the fully expanded version (with sample series mnemonics replacing the symbolic variables) for
any single expression cannot exceed 100 characters.

• When the fully expanded expression is bounded by brackets to ensure mathematical compliance, an expression stored with 78 characters will be expanded to 80 characters.

Other expression limits

A single expression can contain a maximum of:

• 20 functions
• 19 operators (20 operands) e.g. X+Y contains one operator (+) and two operands (X and Y)

Numeric constants can have a maximum of 8 digits.

Expressions containing currency conversion to Japanese Yen via the tilde function must use JY or JPY instead of Y, otherwise Y is treated as a symbolic variable.

When saving a new expression, the following characters are permitted in the name of the expression:

ABCDEFGHIJKLMNOPQRSTUVWXYZ

abcdefghijklmnopqrstuvwxyz

0123456789

$.<(+&!£*)-/%_>?:#@=\' ,^"

Other DFO and AFO limits

In DFO requests, the character limit for a formula is 8192.

In AFO requests, the character limit for each field is 256.

When uploading lists (L#, TR#) in either AFO or DFO, there is a limit of 5000 instruments.

When uploading a User Created Time Series (UCTS) in AFO/DFO, there is a limit of 7400 datapoints.

When uploading a User Created Index (UCI), there is a limit of 2000 instruments. However, it is possible to create a UCI using
other UCI codes as constituents - e.g. X%UCI001 and X%UCI002 could be created containing 2000 instruments each, then
X%UCI003 could be created using X%UCI001 and X%UCI002 as its constituents. Note - when using this method, they must all
be saved on an Auto calculation basis, not Trial.

In DFO, there are limits that come into play due to the size of the array being returned by a single request – e.g. a Time Series
for List request for 20 years of daily values and a list of 2000 series would return 1.3 million values, and this may cause memory
exceptions (this will depend on the version of Office and specification of the PC) – the recommendation is that no more than
300,000 values should be requested in a single request. There is also a timeout of 4 minutes for a single request that may be
encountered if several datatypes are requested in a Static request for a large list. In these cases, requests should be broken
down to return smaller sets of values.

LIST# expressions

The LIST# function enables you to perform analysis on lists of series, using operators to count or calculate average, sum,
maximum, minimum or standard deviation.

The basic format is:

LIST#(List mnemonic,Expression,List-operator)

Where the List-operator must be one of the following:

• CT count of series in list


• CTGT count of series in list where the value of the expression is greater than zero
• CTGE count of series in list where the value of the expression is greater than or equal to zero
• CTLT count of series in list where the value of the expression is less than zero
• CTLE count of series in list where the value of the expression is less than or equal to zero
• AVG the average value of the expression for each series in the list
• SUM the sum value of the expression for each series in the list
• MIN the smallest value in the list for the expression
• MAX the largest value in the list for the expression
• SDN the 'population' standard deviation (division by n constituents)
• SDN1 the 'sample' standard deviation (division by n-1 constituents)

LIST# expressions can be combined in the following ways

LIST#(List mnemonic,Expression,List-operator) Arithmetic-operator LIST#(List mnemonic,Expression,List-operator)

LIST#(List mnemonic,Expression,List-operator) Arithmetic-operator Expression

The Arithmetic-operator is one of +, -, * and /.

Such combined expressions cannot use lists containing more than 1000 series.

Examples:

LIST#(LFRCAC40,ACH#(X,1D),CTLT)
This calculates the number of fallers each day in the French CAC40 index

LIST#(LDAXINDX,X(PE),AVG)
This calculates the simple arithmetic average PE for the German DAX index

LIST#(LFTSEMIB,X(MV),SUM)
This calculates the market value for the Italian FTSE MIB index

LIST#(LFTSE100,REB#(X,1/1/03),AVG)
This recalculates the FTSE100 index to a base of 1/1/2003=100 with each equity having an equal initial weighting. With time the
weights will follow the development of the relative prices.

LIST#(LFTSE100,X-MAV#(X,200D),CTGT)
This counts the number of equities in the FTSE100 list which are above their 200 day moving average
LIST#(LFTSE100,X-MAX#(X,5Y)*0.9,CTGT)
This counts the number of equities in the list which are within 10% of their (5 year) maximum value

LIST#(LFTSE100,X(PEFD12),CTGT)/LIST#(LFTSE100,111E((1/(X(PEFD12))),0),SUM)
This calculates an equally weighted 12M forward PE using only stocks having positive PEFD12.

LIST#(LFTSE100,REB#(X),AVG)-REB#(FTSE100)
This calculates an initially equally weighted price index using the latest constituents of the FTSE100 and subtracts the
benchmark price index

Note:

While you can nest any Datastream functions within the LIST# function, LIST# itself cannot be nested within any other
Datastream function. Therefore: LIST#(LFTSE100,PCH#(X,1D),AVG) is possible, but MAV#(LIST#(LFTSE100,X(PE),AVG),3M)
is not.

Unpadding values or switching off carry forward

Many time series instruments on Datastream have 'padding' applied by default, meaning that on days where no value is
available (null), due to a market holiday for example, the value from the previous day is carried over. This prevents there being a
break in the line when charting the values, for example.

If you prefer to switch off this padding behaviour so that you only see 'real' values with no padded values, it is possible to do this
by applying a suffix to the datatype being displayed:

#S as the suffix on the datatype unpads all values in the time series.

#T unpads values only after the latest real value - all values earlier than the latest real value are left padded.

In order to apply these suffixes, it is necessary to understand which datatype is being used - you may be explicitly defining a
datatype in your request, or you may be using a datatype implicitly, by default.

For example, VOD(RI) explicitly specifies the RI (Return Index) datatype for Vodafone - to switch off padding of RI values, use either VOD(RI)*VOD(PI#S)/VOD(PI#S) or VOD(RI#T) according to preference. It is not recommended to use VOD(RI#S), since unpadding the RI datatype directly does not return correct values. By contrast, VOD on its own does not explicitly define a datatype - here the Price (datatype P) is used implicitly by default, and in order to switch off padding of the price, VOD(P#S) or VOD(P#T) must be used - VOD#S is not valid syntax.

In order to check which datatype is the default for a particular instrument, use the DEF (Default) datatype in a Static request. P (Price) is the default for most equities and PI (Price Index) is the default for most equity indices, but RI (Return Index) may be the default for other equity indices such as the Brazil Bovespa, for instance.

Note: Conversely, if you would like to switch on padding for a time series that does not have padding applied by default, use the
PAD# function - for more information, see the "Display" folder of the Functions Help and Definitions pages.
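As a simple illustration, PAD#(GSK(WC03255),C) applies padding to the Worldscope total debt item for GlaxoSmithKline - the same combination used in the WACC inputs earlier in this document.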
Risk free interest rates

A risk free interest rate is the internal rate of return that can be obtained by investing in a financial instrument without (or with
very limited) credit risk.

Normally this will relate to a short term investment in a financial instrument backed by the government. These money market
securities are assumed to bear no credit risk and have a limited re-investment risk, when the investment is rolled over for
another short term period.

The (annualised) yield on 3-month treasury bills is often used for analysis involving risk free rates; alternatively, a long term rate such as the 10 year government bond yield is also commonly used.

For currencies where no liquid treasury bill market exists (or the market is subject to institutional distortions), interbank rates
such as LIBOR or EURIBOR rates could also be considered. These do, however, bear a minimal credit risk inherent to the
banks active in the market.

Currencies with liquid repo markets, where the general collateral is a risk free long-term government bond, offer another alternative which comes closest to 'risk-free', but such rates are not available for as many currencies as interbank rates.

Short term risk free rate series are available for a range of markets.
Units, scale and currency
Converting to a different currency

Values can be converted to alternative currencies on Datastream using the following syntax:

X(DT)~CC

Where:

• X can be replaced with a specific series mnemonic (e.g. @AAPL for Apple) when using this expression in the Series/List field of a
Time Series request or in a chart; or it can be left as the symbolic variable X, when using the expression on the Datatypes/Expression
line of a request, in order to convert several series all at once.

• DT is the datatype to be converted

• CC is the currency code to convert to. ISO currency codes (such as USD, EUR and JPY for example) are accepted as currency codes,
or Datastream currency codes can equivalently be used instead.

These currency conversions use spot exchange rates sourced from WMR, fixed at 16:00 UK time each day.

For example, to convert a selection of stock prices (datatype P) to euro in a time series request, use X(P)~EUR on the datatypes line, so that X is symbolic and is replaced by each stock mnemonic in turn.
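As an illustrative single-series equivalent (using Barclays, mnemonic BARC, as in other examples in this document), the same conversion can be written directly as:

BARC(P)~EUR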

Currency conversions can also be selected using the $ button in the request window, and via the currency flyout in Datastream Charting.
Decimal places

It is possible to override the default number of decimal places that values are returned at - when this is required in Excel
workflows for example.

• Use the DPL# (Decimal Place) function to increase/decrease the decimal places.

For example, the expression PCH#(VOD/FTSE100,1M) - which calculates the 1 month % change in the Vodafone price relative
to the FTSE 100 benchmark - returns values at 2 decimal places by default. If you would prefer to display values at 4 decimal
places, you can nest this within DPL# as follows:

DPL#(PCH#(VOD/FTSE100,1M),4)

Conversely, if you prefer to display values as integers, you can instead specify zero decimal places:

DPL#(PCH#(VOD/FTSE100,1M),0)

• Alternatively, you can simply multiply by 1.000 where the number of zeros after the decimal point indicates the number of
decimal places you require, in this case 3:

PCH#(VOD/FTSE100,1M)*1.000

How to determine the units/scale for a datatype

There are a number of different ways to view the units / scale of the figures you are looking at.

• In a Static request you can use the suffix #U on the datatype. As some examples, for MV (Market Value), MV#U returns
"1000000", meaning Millions, and for NOSH (Number of Shares) and WC01001 (Sales or Revenues), NOSH#U and
WC01001#U return "1000", meaning Thousands.
• Similarly, for economics series, you can use the ESUNT (Economics Series Units) and ESMAG (Economics Series Magnitude) datatypes - for example, an ESMAG output of 1000000000 indicates a scale of Billions, and an output of 1 indicates actual values.

Note - if you would prefer to display economics series in actual values, use the time series datatype ESA - for example
USGDP...D(ESA) returns actual USD values, rather than billions of USD.

• Alternatively, look at the spotlight section for the series in Navigator - for economics data in particular, the units and scale
are displayed along with other summary metadata.
Volume definitions

The precise definition of Turnover by Volume figures retrieved using the VO equity datatype varies according to the individual exchange. The list below provides further information - all single counted volumes are assumed to be 'sell side only'.

Argentina - Single counted


Australia - Single counted
Austria - Single counted
Bangladesh - Single counted
Belgium - Single counted
Brazil - Single counted
Bulgaria - Single counted
Canada - Single counted
Chile - Single counted
China - Single counted
Colombia - Single counted
Cyprus - Single counted
Czech Republic - Single counted
Denmark - Single counted
Ecuador - Single counted
Egypt - Single counted
Estonia - Single counted
Finland - Single counted
France - Single counted. Official adjusted volumes stored the following morning.
Germany - Single counted
Greece - Single counted
Hong Kong - Single counted
Hungary - Single counted
Iceland - Data not held, hence no definition
India - Single counted
Indonesia - Single counted
Ireland - Single counted
Israel - Single counted
Italy - Single counted
Jamaica - Data not held by Datastream, hence no definition
Japan - Single counted
Jordan - Data not held, hence no definition
Kenya - Single counted
Korea - Single counted
Lithuania - Single counted
Luxembourg - The Stock Exchange does not publish volume data
Malaysia - Single counted
Mauritius - Data not held by Datastream, hence no definition
Mexico - Single counted
Morocco - Single counted
Netherlands - Single counted
New Zealand - Single counted
Nigeria - Data not held by Datastream, hence no definition
Norway - Single counted
Pakistan - Single counted
Peru - Single counted
Philippines - Single counted
Poland - Single counted
Portugal - Single counted
Russia - Single counted
Saudi Arabia - Data not held by Datastream, hence no definition
Singapore - Single counted
Slovakia - Data not held by Datastream, hence no definition
South Africa - Single counted. Also Volume by Value (VA).
Spain - Single counted
Sri Lanka - Single counted
Sweden - Single counted
Switzerland - Single counted
Taiwan - Single counted
Thailand - Single counted
Tunisia - Data not held by Datastream, hence no definition.
Turkey - Single counted
United Kingdom - VO = Single counted. The following volume-related datatypes are also single counted: AB, AN, AV, CB, CN, CV, DB, DN, DV, JB, JN, JV
United States - Single counted
Venezuela - Single counted
Zimbabwe - Single counted
DFO and Datastream Charting workflow tips
See the DFO help for more details on the formula options

https://product.datastream.com/WebHelp/DFO/3.0/DSGRID_formula_format.htm

7 day week data retrieval in DFO

Values on Datastream are stored and retrieved on a 5 day week basis (Monday to Friday) as standard. However it is possible to
retrieve values on a 7 day week basis if preferred, using the below frequency parameters in DFO 3.0 formulas. The 7DME and
7DMEPAD options are designed for Middle East markets.

At the time of writing, these frequency parameters are not yet available to select using the menu options within the DFO Time
Series Request window, but an update to the DFO 3.0 interface to include these options is planned for 2016. In the meantime,
they are available to be used by editing DFO 3.0 formulas directly in the formula bar of the worksheet* after running a standard
5 day week request.

7D - Displays null value for Saturday and Sunday dates

7DPAD - As above for 7D, but with Friday values padded across Saturday and Sunday dates

7DME - Displays null value for Friday and Saturday dates, with Friday values moved to Sundays

7DMEPAD - As above for 7DME, but with Thursday values padded across Friday and Saturday dates

*NB - if you manually edit the formula to include one of these new frequency parameters and then subsequently edit the DFO
formula using the Edit Request button on the DFO ribbon, this setting will be overwritten. Once the new parameters are fully
introduced into the interface however, this will not apply.
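As an illustrative sketch only - assuming the same formula layout as the DSGRID example shown later in this document (where the frequency appears as the fifth argument) - a 7 day week request for the Barclays price over a short date range might look like:

=DSGRID("BARC","P","2016-12-22","2017-01-06","7D","RowHeader=true;ColHeader=true","")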
AFO to DFO conversion tips & troubleshooting

If large lists (e.g. more than 2000 constituents) are used in DFO requests and a timeout is encountered, an alternative method of backend retrieval can be used which should improve performance in these particular scenarios. To use this alternative method, prefix the constituent list code with an asterisk - so for example *LMSSAWFD rather than LMSSAWFD. Please note however that this alternative method is:

• not necessary or recommended for use in standard requests which do not timeout - indeed it may result in slower
retrieval speeds in smaller and/or less complex requests
• only applicable for static requests, not time series requests
• not applicable for the WTIDX (constituent weights) datatype

By default, DFO formulas are set to Auto-Calculate (refresh) when the Excel workbook they are contained within is opened. This can sometimes be inconvenient when opening large files, as the user must wait for the formulas to refresh before the file can be used. The Auto Calculation option can be suppressed, if preferred, via the Options button on the DFO ribbon.

When a new workbook is opened, DFO formulas are set to Auto-Calculate (refresh) whenever any other cells or formulas within
the worksheet are edited/refreshed. This setting can be overridden via the drop down menu on the Refresh button of the DFO
ribbon.

There are some differences between AFO and DFO in the way that requests are sent to our servers from the Request Table – AFO sends the requests sequentially whereas DFO sends them in parallel. The sequencing option switches the DFO requests back to sequential – and this may help prevent timeouts, as fewer requests are being sent simultaneously.
In DFO Request Tables, if there is no value available in the output of a request, a blank cell is displayed by default, whereas in AFO the default null value output was "NA". The presence of blank cells rather than NA null values can sometimes lead to #VALUE! errors in Excel. It is possible to customize the null value output in DFO however, either via the Request Table settings, or via the Options button on the DFO ribbon.

=TODAY() is an example of a volatile Excel function, meaning that its value (in this case, today's date) can change over time,
without any of its arguments changing. This can sometimes cause problems with requests looping or hanging within workbooks
containing DFO formulas, particularly in large files. We recommend using the Eikon function =RtTODAY() instead, to resolve
this problem.

If any AFO Index files containing pre- or post-conversion customized macros encounter problems with the macros not being
triggered after converting to DFO Index files, and macros have already been enabled and trusted in Excel Options > Trust
Centre, the cause may be due to the macro(s) being Digitally Signed - and removing the Digital Signature should resolve the
problem in this case. To do this, from the Developer ribbon* go to Visual Basic > Tools > Digital Signature > Remove.

*Note that if the Developer ribbon is not displayed in Excel, it can be enabled by clicking the Office button in the top left of the
Excel window > Excel Options > Popular > Show Developer tab in the ribbon.
Additional DSGRID parameters
Note – these parameters will be cleared if the Edit option is used on the Datastream ribbon

There are some additional parameters that can be added to DSGRID formulas to access additional functionality (see
http://product.datastream.com/WebHelp/DFO/3.0/DatastreamforOffice.htm for the standard set)

The Not Available String can be overridden using - NotAvailableString=N/A; NotAvailableString=Blank, etc

The number of datatypes sent to the servers in each bundle for a static request can be set where timeouts are occurring, using MaxConfigSnapshotRequests=N, where N is the number of datatypes per bundle (use M:N in the Request Table Request format option)

The output of a static request can be sorted based on different datatypes/functions using SORTA=datatype/expression for ascending order and SORTD=datatype/expression for descending order – with the following conditions -

1. Colheader = true must be provided with any of the SORT options or rowheader=true if transpose = true.
2. Maximum of 2 SORT options can be used for sorting.
3. SORTA/SORTD functionality is limited to 20 Datatypes/Expressions and 1000 series.
4. "NA" values will be at the end irrespective of the SORT being used.

For example

=@DSGRID("LFTSE100","NAME;mnem;X(FEI#NAME);PCH#(X/X(FEI),1Y);PCH#(X,1Y)","Latest
Value","","","SORTD=X(FEI#NAME);SORTD=MV;ColHeader=true","")

Date alignment for economics series

The default date alignment for an economics series is related to the characteristics of the data and how the figures are
published by the original source: series for which an average or sum is calculated when requested at a lower frequency (e.g. a
monthly series requested at quarterly frequency) normally have values aligned to the mid-period dates by default, whereas
series for which the end-period value is displayed when requested at a lower frequency (such as UKRESERVA – UK GOVERNMENT GROSS RESERVE ASSETS) have values aligned to the end-period dates by default.
Most economics series have values aligned to the mid-period dates.

It is possible to override the default date alignment of economics series however, using the ALI# (Alignment) function. This is of
particular use in Excel worksheets – if you have set up multiple time series requests for economics series in your sheet, you
may wish to use this function to help match dates with each other.

Note: When running a time series request containing multiple economics series mnemonics, the dates returned are determined
by the series placed first in the request. So if the first series is mid-period aligned (either by default or via the ALI# function),
then the dates returned will be mid-period - regardless of the alignment characteristics of any other series placed subsequently
within the same request.

The middle of the period for monthly series is the 15th day of the month; for quarterly series it is 15
February/May/August/November; for yearly series it is 30 June.

The end of the period for monthly series is the last weekday of the month; for quarterly series it is the last weekday of the
calendar quarter; for yearly series it is 31 December.

General format: ALI#(Expression,Parameter)

Where Parameter is:

MID Values are aligned to the mid-period dates

VAL Values are aligned to the end-period dates
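For example, to align the UK reserve assets series mentioned above to mid-period dates:

ALI#(UKRESERVA,MID)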

Date Alignment in Datastream Charting

The ALI# function can also be used in Datastream Charting, however there is an alternative feature which allows you to adjust
the alignment of a line without needing to use ALI# – this setting is accessed on the Data Alignment tab of the Line Settings
window.

This feature is particularly useful for stacked bar charts – to ensure that all bars stack on top of each other correctly when the
default alignment is not the same for all lines.

The Auto setting displays values according to the default alignment of the series; the Mid setting is equivalent to ALI#(X,MID)
and the End setting is equivalent to ALI#(X,VAL).
You can quickly set the alignment of all lines on the chart by clicking the ‘all’ link on the right-hand side. Note that this ‘all’ link is
available for all other line settings in Datastream Charting too.

Drawing a custom trend line from an equation

It is possible to draw custom trend lines in a chart according to equations for lines, such as the y=mx+c equation for a straight
line, using a time scatter chart and a 'dummy' time series DUMMYDY which counts up by 1 each day.

A simple 45° line following the line equation y=x can be drawn using the expressions below, for example. A constant (c), i.e. the Y axis intercept, could be added by appending e.g. +5 to the end of the Y axis expression.

X axis: DUMMYDY-VAL#(DUMMYDY,VLO)

Y axis: DUMMYDY-VAL#(DUMMYDY,VLO)

Settings for this chart include a start date of -1Y and daily display frequency.

A curved trend line following the line equation y = 0.5x^0.3 can be drawn using the expressions:

X axis: DUMMYDY-VAL#(DUMMYDY,VLO)

Y axis: 0.5*(POW#(DUMMYDY-VAL#(DUMMYDY,VLO),0.3))

The settings used are the same as in the chart above except for the Y axis expression.
Half-year frequency

There is no direct setting in Datastream to retrieve a series at half-yearly frequency; however, the OFFSET function in Excel can be used as a workaround.

In the below example, the JPABLALB series is retrieved at default frequency using a standard DFO Time Series request, then
this formula is used in every cell from H3 to H10:

=OFFSET($C$3,(ROW()-3)*2,0,1,1)

And similarly, this formula is used in every cell from I3 to I10:

=OFFSET($C$3,(ROW()-3)*2,1,1,1)
Line of best fit / trend line

A line of best fit can be added to a Scatter chart by selecting from the options in the Statistics section in the scatter's settings
window. Parallel Standard Deviation bands can also be added here, if desired.

Summary Statistics can also be added using point-and-click Annotation settings in a Text Box, for example.

Trend Lines can also be added to standard time series line charts, using the fly-out menu. This is a shortcut to apply the relevant global expression, in this case:

080E "LINEAR TREND LINE"

REGV#(CSUM#(1),X)
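For example, applied to a series such as the FTSE 100:

080E(FTSE100)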
Not displaying dates where values are #NA in DFO 3.0

Datastream by default will display all dates for a period in a Time Series request – so for example the 5 days in the week
(Monday to Friday). However, it is now possible to not display dates where the corresponding values are ‘not available’, if
preferred, using the below frequency parameters in DFO 3.0 formulas.

At the time of writing, these frequency parameters are not yet available to select using the menu options within the DFO Time
Series Request window. In the meantime, they are available to be used by editing DFO 3.0 formulas directly in the formula bar
of the worksheet after running a standard request.

So for example – if the national holiday dates at the end of the year are not required – you can use the formula below –

=DSGRID("BARC","P#S","2016-12-22","2017-01-
06","DN","RowHeader=true;ColHeader=true;Heading=true;Code=true;DispSeriesDescription=false;YearlyTSFormat=false;Quart
erlyTSFormat=false","")

This displays the series without the national holiday dates (Monday 26th Dec, Tuesday 27th Dec and Monday 2nd Jan).
Removing capitalization in AFO/DFO output

Text output from Datastream is returned in capital letters. If you would prefer to alter the format to Proper case where only the
first letter of each word is capitalized, use the Excel =PROPER function to achieve this.
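For example, if cell A2 (a hypothetical cell reference) contains the text output of a NAME datatype request, =PROPER(A2) returns the same text with only the first letter of each word capitalized.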

Using SEDOL and CUSIP identifiers

It is possible to use SEDOL and CUSIP identifiers as input for Datastream requests, but the codes must be prefixed in order for them to be recognised - this avoids conflicts with other types of input code.

SEDOL identifiers, particularly those which are alphanumeric, must be prefixed with "UK" (regardless of market) and CUSIP
identifiers must be prefixed with "U".

So for example:

The SEDOL for Visa 'A' is B2PZN04 - therefore to use this on Datastream, the input code should be entered in a request as
UKB2PZN04
The CUSIP for Apple is 037833100, therefore the input code should be U037833100

A quick way to apply this "U" or "UK" prefix for a list of identifiers is to use a simple Excel formula - i.e. ="U"&... for a CUSIP or ="UK"&... for a SEDOL.

Note that Excel removes any leading zeros from numbers entered in a cell by default - and CUSIP and SEDOL identifiers often do begin with zero. When working with these identifiers it is therefore best either to format the cells as Text rather than General before entering/pasting the identifiers into the worksheet, or to use the Excel TEXT function with a "000000000" format string to ensure that the CUSIP identifier is always 9 digits long. Similarly, a "0000000" format string can be used to ensure that a SEDOL identifier is always 7 digits long.
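As an illustrative combination of the prefix and the TEXT approach (A2 being a hypothetical cell containing a numeric CUSIP):

="U"&TEXT(A2,"000000000")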

In this example, the DFO request in cell G2 references the list of prefixed codes created in column E.

Using the DS.SYMBOLLOOKUP datatype to find the best matching Datastream series from a series description

The DS.SYMBOLLOOKUP datatype returns the best 5 matching series for a description entered in the series field. For
example - =DSGRID("BARCLAYS","DS.SYMBOLLOOKUP","Latest Value","","","RowHeader=true;ColHeader=true")

will return the best 5 matching series for "BARCLAYS" – with the matches based on the same search engine as used in the series suggest/type ahead in the Datastream for Office and Datastream Charting interfaces.

We are working on extending this so that you can specify the data category of the series to return, and the number of results to return.

Using VBA commands to customize your sheets and in-house time series uploads - Datastream For Office (DFO) 3.0
Refresh methods

Public methods in the underlying API are available to refresh formulas using VBA macros - this includes methods to refresh the selected formulas, the active worksheet, or the active workbook -

Application.COMAddIns("PowerlinkCOMAddIn.COMAddIn").Object.RefreshSelection

Application.COMAddIns("PowerlinkCOMAddIn.COMAddIn").Object.RefreshActiveSheet

Application.COMAddIns("PowerlinkCOMAddIn.COMAddIn").Object.RefreshWorkbook

If the selected range contains DSGRID and TF (Deal Analytics) formulas, RefreshSelection will refresh both types of formula, but not other TF formulas.
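As a minimal illustration (the macro name is just an example), the active worksheet refresh can be wrapped in a simple VBA macro:

Sub RefreshDatastreamSheet()
    ' Refresh all Datastream formulas on the active worksheet
    Application.COMAddIns("PowerlinkCOMAddIn.COMAddIn").Object.RefreshActiveSheet
End Sub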

Methods to display Navigator with a specified category

A public method to display the navigator interface and return the selected series to the cell that has the focus

Application.COMAddIns("DFOAddInExcel2010").Object.Navigator("Series", "category Id", "Row_wise")

A public method to display the navigator interface and return the selected datatype to the cell that has the focus.

Call Application.COMAddIns("DFOAddInExcel2010").Object.Navigator("Datatype", "", "Column_wise")

The category Id parameter is a numeric code specifying the data category of series to display (for example, "1" is used in the example below).


There are the following orientation options -

Column_wise - displays several select series/datatypes in a column

Row_wise - displays several select series/datatypes in a row

The isRIC parameter should be set to TRUE to return the RIC, and FALSE to return the Datastream code/mnemonic (and
should be FALSE for datatypes).

For example:-

RequestType = "Series"

CatergoryID = "1"

Orientation = "COLUMN_WISE"

isRic = "FALSE"

ThisWorkbook.Sheets("Data").Range("A2").Select

Call Application.COMAddIns("PowerlinkCOMAddIn.COMAddIn").Object.Navigator(RequestType, CatergoryID, Orientation, isRic)

Methods to control the User Created Time Series (UCTS) template sheet

A public method to reset the template sheet removing all the dates, the header block and the values - btnReset_Click

A public method to upload all the series in the template - CustomUpload

A public method to set the date range in the template - CustomSetDateRange
