This is to certify that the dissertation entitled "Strong Markov Properties for Markov Random Fields," presented by Kimberly Kay Johannes Kinateder, has been accepted towards fulfillment of the requirements for the Ph.D. degree in Statistics.

Date: Nov. 5, 1990

STRONG MARKOV PROPERTIES FOR MARKOV RANDOM FIELDS

by

Kimberly Kay Johannes Kinateder

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Department of Statistics and Probability

1990

Abstract

Strong Markov Properties for Markov Random Fields

by Kimberly Kay Johannes Kinateder

Markov properties and strong Markov properties for random fields are defined and discussed. Special attention is given to those defined by I. V. Evstigneev. Various definitions of measurability for set-valued functions have been proposed; these definitions are shown to be equivalent to each other for compact domain-valued functions, called random domains. The strong Markov nature of Markov random fields with respect to random domains such as $[0,\tau_1]$ and $[\tau_1,\tau_2]$, where $\tau_1$ and $\tau_2$ are stopping times, is explored. This concept is extended to higher dimensions by introducing an extension of stopping times called membranes. A special case of this extension is shown to generalize a recent work of Merzbach and Nualart. Finally, the so-called corner Markov and strong corner Markov properties are introduced, and the strong corner Markov property is proven to hold under conditions which include a Cairoli-Walsh (F4) type of condition. The strong Markov nature of reciprocal Markov processes is explored using techniques of Stroock and Varadhan.

This thesis is dedicated to the glory of God, in Whom all things are possible. Philippians 4:13.

Acknowledgements

The author would like to thank Professor V. Mandrekar for his guidance throughout her years at Michigan State University. She would also like to thank Professors Raoul LePage, Dennis Gilliland, and V. Sreedharan for their time and interest. In addition, she would like to thank those from the Office of Affirmative Action at Michigan State University who arranged for her support under the Patricia Roberts Harris Fellowship Program. Finally, the author would like to thank her family — John, Dad, Mom, Jeff, and Tammy — for their steadfast support and confidence in her.

Contents

List of Figures ........................................................ iv
1 Introduction .......................................................... 1
2 Definitions and Preliminary Theorems .................................. 3
  2.1 Markov properties ................................................. 3
  2.2 Strong Markov properties .......................................... 12
3 Special Random Domains ............................................... 17
  3.1 Random domains of the form Q = [0, τ] when d = 1 ................. 17
  3.2 A counterexample .................................................. 20
  3.3 Random domains of the form Q = [0, τ] when d > 1 ................. 21
  3.4 Stopping lines .................................................... 22
  3.5 Random domains of the form Q = [τ_1, τ_2] ........................ 29
4 Corner Markov and Reciprocal Processes ............................... 31
  4.1 The corner Markov property ........................................ 31
  4.2 A martingale approach ............................................. 41
Bibliography ............................................................ 43

List of Figures

2.1 Membrane M (d = 2) .................................................. 11
2.2 [0, M_1] and [M_1, M_2] for M_1, M_2 ∈ M such that M_1 ≤ M_2 ....... 11
4.1 Corner K_b^a ........................................................ 32

Chapter 1

Introduction

The study of Markov properties for random fields was initiated by Lévy [Lev48]. McKean [McK63], Molchan [Mol71], Pitt [Pit71], Kallianpur and Mandrekar [KM74], and Kunch [Kun79] studied necessary and sufficient conditions for Markov properties of Gaussian random fields. In [Man83], Markov properties for general random fields were studied. Evstigneev initiated the study of strong Markov properties in multi-dimensions by introducing Markov times in [Evs77] and proposed a necessary and sufficient condition (splitting) for this strong Markov property in [Evs82]. In [Evs88], Evstigneev presented a nonanticipating sufficient condition for a Markov random field to have his strong Markov property with respect to a set-valued random function (specifically, a random domain). We shall refer to this condition as (2.6). Rozanov [Roz82] also explored set-valued random functions and strong Markov properties in multi-dimensions. This study has found applications in various fields ([Dud82], [Nel73], [Sim74]).

For $\mathbb{R}^1_+$, our purpose is to systematically study Evstigneev's strong Markov property for random domains related to stopping times. In order to extend these results to $\mathbb{R}^d_+$ with $d > 1$, we introduce random membranes as an analogue of stopping times and study Evstigneev's strong Markov property with respect to random domains related to random membranes. As an example in $\mathbb{R}^2_+$, we show that the so-called decreasing stopping lines occurring in a recent work of Merzbach and Nualart [MN90] on point processes are a special type of random membrane that satisfies condition (2.6). [MN90] presents another strong Markov property, and we show that, under some natural assumptions, condition (2.6) is sufficient for a point process to have this strong Markov property with respect to a decreasing stopping line.

In Cairoli and Walsh [CW78], a one-dimensional property (F4) is used to investigate two-dimensional Markov properties. In Chapter 4, we study one-dimensional strong Markov properties and relate them to two-dimensional strong Markov properties. Finally, the study of strong Markov properties for reciprocal Gaussian processes is undertaken. The results obtained present a good beginning to this study, and some methods from [Str87] are clarified. Since reciprocal processes play a major role in various applied problems, it appears that the study of strong Markov properties for reciprocal processes may have a significant impact on applications.

Chapter 2

Definitions and Preliminary Theorems

2.1 Markov properties

Let $(\Omega, \mathcal{F}, P)$ be a complete probability space. Throughout this thesis, we assume that all sub-$\sigma$-algebras of $\mathcal{F}$ introduced will contain all sets of measure $0$ from $\mathcal{F}$. Since our goal is to determine when certain random fields have a strong Markov property, we need to understand the simple notion of a random field.
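As a concrete point of reference (this is a standard example, recorded here only for illustration), consider the Brownian sheet on the positive quadrant, the centered Gaussian family
$$W = \{W_t\}_{t \in \mathbb{R}^2_+}, \qquad E(W_t) = 0, \qquad E(W_s W_t) = (s_1 \wedge t_1)(s_2 \wedge t_2), \qquad s, t \in \mathbb{R}^2_+.$$
It is a family of random variables indexed by $\mathbb{R}^2_+$ and is therefore a random field in the sense of Definition 2.1 below; the point process of Example 2.1 is another instance.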
Definition 2.1 Let $\xi = \{\xi_t\}_{t \in \mathbb{R}^d_+}$ be a family of random variables defined on $(\Omega, \mathcal{F}, P)$. We call $\xi$ a random field.

Example 2.1 One of the special cases of random fields that we study is the point process. In [MN90], the point process for $d = 2$ is defined as follows. Let $N$ be a random measure on $\mathbb{R}^2_+$ such that $N(\omega)$ is a finite or countable sum of Dirac measures at random and distinct points $Z_i(\omega)$, $i = 0, 1, \ldots$. We also assume that
$$N(\{z = (z_1, z_2) \in \mathbb{R}^2_+ : 0 \le z_i \le t_i,\ i = 1, 2\}) < \infty,$$
for all $t = (t_1, t_2) \in \mathbb{R}^2_+$, and that the measure of the axes is zero. The point process $\xi$ is then defined for $t \in \mathbb{R}^2_+$ by
$$\xi_t = N(\{z = (z_1, z_2) \in \mathbb{R}^2_+ : 0 \le z_i \le t_i,\ i = 1, 2\}).$$

Associated with a random field $\xi$ we define $\sigma$-algebras $\mathcal{G}_A \equiv \sigma(\xi_t, t \in A)$ for $A \subseteq \mathbb{R}^d_+$ and germ-field $\sigma$-algebras $\mathcal{F}_A \equiv \bigcap_{\epsilon > 0} \mathcal{G}_{A^\epsilon}$ for $A \subseteq \mathbb{R}^d_+$, where $A^\epsilon = \{t \in \mathbb{R}^d_+ : d(t, A) < \epsilon\}$. Throughout we assume $\mathcal{G}_{\mathbb{R}^d_+} = \mathcal{F}$.

Using the above $\sigma$-algebras, we can express several different Markov properties for a random field $\xi$. For this we need the concept of conditional independence.

Definition 2.2 Let $\mathcal{A}$, $\mathcal{B}$, and $\mathcal{G}$ be sub-$\sigma$-fields of $\mathcal{F}$. We say that $\mathcal{A}$ and $\mathcal{B}$ are conditionally independent given $\mathcal{G}$ if
$$P(A \cap B \mid \mathcal{G}) = P(A \mid \mathcal{G})\, P(B \mid \mathcal{G}), \quad \text{for all } A \in \mathcal{A} \text{ and } B \in \mathcal{B},$$
where $P(\cdot \mid \mathcal{G})$ is the conditional probability given $\mathcal{G}$. We denote this by $\mathcal{A} \perp\!\!\!\perp \mathcal{B} \mid \mathcal{G}$. In our case, $P(\cdot \mid \mathcal{G})$ will be an equivalence class.

The first Markov property that we shall introduce was proposed by Evstigneev [Evs88]. A subset $A \subseteq \mathbb{R}^d_+$ is called a domain if $A = \overline{A^{\circ}}$. Let $T$ denote the set of all compact domains in $\mathbb{R}^d_+$. Assume $\emptyset \in T$. A random field $\xi$ is said to be Markov with respect to $B \in T$ if for all $A, C \in T$ with $A \subseteq B \subseteq C$,
$$\mathcal{F}_B \perp\!\!\!\perp \mathcal{F}_{\overline{C^c}} \mid \mathcal{F}_{\overline{B \setminus A}}.$$

Definition 2.3 We shall say that a random field $\xi$ is Markov if $\xi$ is Markov with respect to $B$, for all $B \in T$.

Given this definition, a natural question arises. When $d = 1$, how does this Markov property relate to the germ-field Markov property (denoted GFMP), $\mathcal{F}_{[0,t]} \perp\!\!\!\perp \mathcal{F}_{[t,\infty)} \mid \mathcal{F}_{\{t\}}$ for all $t \ge 0$, and the classical Markov property, $\mathcal{G}_{[0,t]} \perp\!\!\!\perp \mathcal{G}_{[t,\infty)} \mid \mathcal{G}_{\{t\}}$ for all $t \ge 0$? The answer is given below.

Lemma 2.1 If $\xi$ has the classical Markov property, then $\xi$ is Markov.

Before proving the claim, we will state a few results dealing with conditional independence. Proofs for Propositions 2.1 through 2.4 and Theorem 2.1 can be found in [Man83], pp. 163-167. A proof of Proposition 2.5 is given in [Roz82], p. 58.

Proposition 2.1 If $\mathcal{A}$, $\mathcal{B}$, $\mathcal{G}'$, and $\mathcal{G}$ are sub-$\sigma$-algebras of $\mathcal{F}$ with $\mathcal{A} \perp\!\!\!\perp \mathcal{B} \mid \mathcal{G}$ and $\mathcal{G}' \subseteq \mathcal{G} \vee \mathcal{B}$, then $\mathcal{A} \perp\!\!\!\perp \mathcal{G}' \mid \mathcal{G}$.

Proposition 2.2 Let $\{O_i, i \in I\}$ be disjoint open subsets of $X$ and $U = \bigcup_{i \in I} O_i$. If $\mathcal{G}_{O_i} \perp\!\!\!\perp \mathcal{G}_{O_i^c} \mid \mathcal{G}_{\partial O_i}$ for all $i \in I$, then $\mathcal{G}_U \perp\!\!\!\perp \mathcal{G}_{U^c} \mid \mathcal{G}_{\partial U}$. We call the latter condition the simple Markov property on $U$.

Proposition 2.3 $\xi$ has the simple Markov property on all open subsets of $X$ if and only if $\xi$ has the simple Markov property on all open intervals in $X$.

Theorem 2.1 If $\xi$ has the simple Markov property with respect to a set $A$, then
$$\bigcap_{\text{open } O \supseteq A} \mathcal{G}_O \;\perp\!\!\!\perp \bigcap_{\text{open } O \supseteq A^c} \mathcal{G}_O \;\Big|\; \bigcap_{\text{open } O \supseteq \partial A} \mathcal{G}_O.$$
That is, $\xi$ has the germ-field Markov property (GFMP) on $A$.

Proposition 2.4 $\xi$ has the GFMP on a set $A$ if and only if for every open set $O \supseteq \partial A$, $\mathcal{G}_A \perp\!\!\!\perp \mathcal{G}_{A^c} \mid \mathcal{G}_O$.

Proposition 2.5 If $\{\mathcal{G}_n\}_n$ are monotonically decreasing sub-$\sigma$-algebras of $\mathcal{F}$ such that $\mathcal{A} \perp\!\!\!\perp \mathcal{B} \mid \mathcal{G}_n$ for all $n \in \mathbb{N}$, then $\mathcal{A} \perp\!\!\!\perp \mathcal{B} \mid \bigcap_{n=1}^{\infty} \mathcal{G}_n$.

Proof of Lemma 2.1. Assume $\xi$ has the classical Markov property. Then
$$\mathcal{G}_{[0,t]} \perp\!\!\!\perp \mathcal{G}_{(t,\infty)} \mid \sigma(\xi_t), \quad \text{for all } t \ge 0.$$
From $\mathcal{G}_{[0,t]} = \mathcal{G}_{[0,t)} \vee \sigma(\xi_t)$, $\mathcal{G}_{[t,\infty)} = \mathcal{G}_{(t,\infty)} \vee \sigma(\xi_t)$, and Proposition 2.1, we get that $\xi$ has the simple Markov property on sets of the form $[0,t)$ and $(t,\infty)$.
By Proposition 2.2, $\xi$ has the simple Markov property on sets of the form $[0,s) \cup (t,\infty)$, $s < t$. By Proposition 2.1,
$$\mathcal{G}_{[0,s)\cup(t,\infty)} \perp\!\!\!\perp \mathcal{G}_{(s,t)} \mid \mathcal{G}_{\{s,t\}}, \quad \text{for all } t > s;$$
that is, $\xi$ has the simple Markov property on all open intervals in $(0,\infty)$. Proposition 2.3 now gives us the simple Markov property on all open sets in $(0,\infty)$. The GFMP on all open sets follows from Theorem 2.1.

Now let $a, b \in T$ with $a \subseteq b$. Our goal is to show $\mathcal{F}_{\overline{a^c}} \perp\!\!\!\perp \mathcal{F}_b \mid \mathcal{F}_{\overline{b \setminus a}}$. Let $\epsilon > 0$, $A = b^{\circ}$, $O = (a^c \cap b)^{\epsilon}$, and apply Proposition 2.4 to get
$$\mathcal{G}_{b^{\circ}} \perp\!\!\!\perp \mathcal{G}_{(b^{\circ})^c} \mid \mathcal{G}_{(a^c \cap b)^{\epsilon}}.$$
Note that $\mathcal{G}_{b^{\epsilon}} = \mathcal{G}_{b^{\circ}} \vee \mathcal{G}_{(a^c \cap b)^{\epsilon}}$ and $\mathcal{G}_{(a^c)^{\epsilon}} = \mathcal{G}_{(b^{\circ})^c} \vee \mathcal{G}_{(a^c \cap b)^{\epsilon}}$. Hence
$$\mathcal{G}_{(a^c)^{\epsilon}} \perp\!\!\!\perp \mathcal{G}_{b^{\epsilon}} \mid \mathcal{G}_{(a^c \cap b)^{\epsilon}},$$
by Proposition 2.1. Finally, we apply Proposition 2.5 and the facts that $\mathcal{F}_{\overline{a^c}} \subseteq \mathcal{G}_{(a^c)^{\epsilon}}$, for all $\epsilon > 0$, and $\mathcal{F}_b \subseteq \mathcal{G}_{b^{\epsilon}}$, for all $\epsilon > 0$, to get
$$\mathcal{F}_{\overline{a^c}} \perp\!\!\!\perp \mathcal{F}_b \mid \mathcal{F}_{\overline{a^c \cap b}};$$
that is, $\mathcal{F}_{\overline{a^c}} \perp\!\!\!\perp \mathcal{F}_b \mid \mathcal{F}_{\overline{b \setminus a}}$, since $a^c \cap b = b \setminus a$. In particular, given $A, C \in T$ with $A \subseteq B \subseteq C$, taking $a = A$ and $b = B$ above and noting that $\mathcal{F}_{\overline{C^c}} \subseteq \mathcal{F}_{\overline{A^c}}$, Proposition 2.1 yields $\mathcal{F}_B \perp\!\!\!\perp \mathcal{F}_{\overline{C^c}} \mid \mathcal{F}_{\overline{B \setminus A}}$. Therefore $\xi$ is Markov with respect to every $B \in T$; that is, $\xi$ is Markov. $\Box$

The converse of the above claim is not true. A counterexample for this comes from $A = (0,t)$ and the process $X$ defined by
$$\ddot{X}(t) + X(t) = \dot{B}(t), \quad t \ge 0, \qquad X(0) = \dot{X}(0) = 0,$$
where $B$ is Brownian motion. By [Doo44], $\{X_t, t > 0\}$ has the GFMP for all $A$, but it does not have the simple Markov property for all $A$.

Another Markov property for $\mathbb{R}^1_+$ that we need is the reciprocal Markov property.

Definition 2.4. $\xi$ has the reciprocal Markov property if
$$\mathcal{G}_{[s,t]} \perp\!\!\!\perp \mathcal{G}_{[0,s] \cup [t,\infty)} \mid \mathcal{G}_{\{s,t\}}, \quad \text{for all } 0 \le s \le t.$$

Definition 2.5. A positive random variable $\tau$ is called a stopping time if $\{\tau \le u\} \in \mathcal{G}_{[0,u]}$, for all $u \ge 0$, and a two-sided stopping time if $\{u_1 \le \tau \le u_2\} \in \mathcal{G}_{[u_1,u_2]}$, for all $u_2 \ge u_1 \ge 0$.

Observe that $\tau$ is a stopping time if and only if $\{\omega : [0, \tau(\omega)] \subseteq [0,u]\} \in \mathcal{G}_{[0,u]}$ for all $u \ge 0$.

In order to generalize the concept of a stopping time to higher dimensions, we need to define measurability of set-valued functions ([Evs77], [Evs88], [MN90], [Roz82]). Define a $\sigma$-algebra $\mathcal{T}$ on $T$ by letting
$$\mathcal{T} = \sigma\big(\{A \in T : A \subseteq U\} : U \text{ is an open subset of } \mathbb{R}^d_+\big).$$
It is easy to prove that $\mathcal{T}$ can also be generated by sets of the form $\{A \in T : A \subseteq U\}$ where $U$ ranges over all Borel subsets of $\mathbb{R}^d_+$. For further information, see [Evs88].

Definition 2.6. A measurable map from $(\Omega, \mathcal{F})$ to $(T, \mathcal{T})$ is called a random domain [Evs88]. That is, a random domain is a compact domain-valued function $Q$ on $\Omega$ such that
$$\{Q \in D\} \in \mathcal{F}, \quad \text{for all } D \in \mathcal{T}. \qquad (2.1)$$

It is worth noting that there are other ways to express this measurability (2.1).

Lemma 2.2 If $Q(\omega) \in T$ for all $\omega \in \Omega$, then the following conditions are equivalent to (2.1):
$$\{Q \cap A \ne \emptyset\} \in \mathcal{F}, \quad \text{for all } A \in T \qquad (2.2)$$
$$\{Q \subseteq A\} \in \mathcal{F}, \quad \text{for all } A \in T \qquad (2.3)$$
$$\{Q \subseteq B\} \in \mathcal{F}, \quad \text{for all open subsets } B \subseteq \mathbb{R}^d_+. \qquad (2.4)$$

Proof. (2.1), (2.2), and (2.3) are equivalent using [Evs88], p. 31. The equivalence of (2.3) and (2.4) follows from the relations
$$\{Q \subseteq B\} = \bigcup_{n=1}^{\infty} \{Q \subseteq B^{-1/n}\}$$
for any open domain $B$ in $\mathbb{R}^d_+$, where $B^{-1/n} = \{t \in \mathbb{R}^d_+ : d(t, B^c) > \tfrac{1}{n}\}$, and
$$\{Q \subseteq A\} = \bigcap_{n=1}^{\infty} \{Q \subseteq A^{1/n}\}$$
for any $A$ in $T$. $\Box$

Condition (2.4) is the definition of measurability used in [Roz82]. Rozanov also noted ([Roz82], p. 80) that the measurability of a $T$-valued map on $\Omega$ can be expressed in terms of the measurability of random variables, as described in the next lemma.

Lemma 2.3 Suppose $Q(\omega) \in T$ for all $\omega \in \Omega$. Then $1_Q(z)$ is a measurable random variable for each $z$ in $\mathbb{R}^d_+$ if and only if $Q$ is measurable.

We shall also study the strong Markov property for point processes. In order to relate this work to that in [MN90], we need the definition of measurability from [MN90]. In [MN90], a random set $Q$ is said to be measurable if
$$\{Q \cap A \ne \emptyset\} \in \mathcal{F}, \quad \text{for all open } A \subseteq \mathbb{R}^d_+. \qquad (2.5)$$
Once again, this definition of measurability is equivalent to Evstigneev's definition of measurability when $Q$ takes values in $T$.

Lemma 2.4 Assume $Q(\omega) \in T$ for each $\omega \in \Omega$. Then (2.5) holds if and only if (2.1) holds.

Proof.
(2.1) implies (2.5) by the relation $\{Q \cap A \ne \emptyset\} = \{Q \subseteq A^c\}^c$ for open $A$, together with the fact that $\{B \in T : B \subseteq A^c\} \in \mathcal{T}$ (recall that $\mathcal{T}$ is also generated by the sets $\{B \in T : B \subseteq U\}$ with $U$ Borel). The reverse implication follows from the same relation after noting that any $B \in T$ can be written as $(B^c)^c$ and that $B^c$ is open. $\Box$

Let $\mathcal{C}$ be a collection of sets $C$. A random set-valued function $D$ is said to be compatible with $\mathcal{C}$ if
$$\{D \subseteq C\} \in \mathcal{F}_C, \quad \text{for all } C \in \mathcal{C}.$$
A random set-valued function $D$ is said to be co-compatible with a collection of sets $\mathcal{C}$ if
$$\{D \supseteq C\} \in \mathcal{F}_C, \quad \text{for all } C \in \mathcal{C}.$$

As mentioned earlier in this section, an example of a random domain when $d = 1$ is the random interval $Q = [0, \tau]$, where $\tau$ is a positive random variable. We provide an extension of this concept to $\mathbb{R}^d_+$.

Definition 2.7. An $\mathbb{R}^d_+$-membrane is a subset $M \subseteq \mathbb{R}^d_+$ such that if $d = 1$, then $M = \{m\}$ for some $m \in (0,\infty)$, and if $d \ge 2$, then $M$ satisfies (1) and (2) below:

(1) $M \cap \{u \in \mathbb{R}^d_+ : u_i = 0\}$ is an $\mathbb{R}^{d-1}_+$-membrane in $\{u \in \mathbb{R}^d_+ : u_i = 0\}$ for all $i = 1, \ldots, d$;

(2) there exists a continuous one-to-one map $\gamma : B_{d-1}(0,1) \cap \mathbb{R}^{d-1}_+ \to \mathbb{R}^d_+$ such that $\gamma(B_{d-1}(0,1) \cap \mathbb{R}^{d-1}_+) = M$.

We call it a membrane because the first exit set 'attaches' itself to each axis ($d = 2$) or axis plane ($d = 3$) and 'stretches' itself between the axes ($d = 2$) or axis planes ($d \ge 3$).

Figure 2.1: Membrane $M$ ($d = 2$). [The figure shows a curve $M$ joining the $y$-axis to the $x$-axis.]

We can define sets of the form $[0, M]$ and $[M_1, M_2]$ for $d > 1$ dimensions in a natural way using membranes, as described below. Let $\mathcal{M}$ denote the class of all membranes. If $M \in \mathcal{M}$, let
$$[0, M] \equiv \{u \in \mathbb{R}^d_+ : u \text{ lies on some polygon } p \text{ in } \mathbb{R}^d_+ \text{ which connects } 0 \text{ to an element } v \in M \text{ and such that } p \cap M = \{v\}\}.$$
Define a partial ordering $\le$ on $\mathcal{M}$ by $M_1 \le M_2$ if and only if $[0, M_1] \subseteq [0, M_2]$. Also define $[M_1, M_2] = [0, M_2] \cap [0, M_1]^c$ for $M_1, M_2 \in \mathcal{M}$ such that $M_1 \le M_2$.

Figure 2.2: $[0, M_1]$ and $[M_1, M_2]$ for $M_1, M_2 \in \mathcal{M}$ such that $M_1 \le M_2$.

2.2 Strong Markov properties

Now let us recall the classical one-dimensional strong Markov property. As mentioned in Section 2.1, a nonnegative random variable $\tau$ is a stopping time if $\{\tau \le u\} \in \mathcal{G}_{[0,u]}$, for all $u \ge 0$. Define the stopped $\sigma$-algebra $\mathcal{F}_\tau$ by
$$\mathcal{F}_\tau = \{A \in \mathcal{F} : A \cap \{\tau \le u\} \in \mathcal{G}_{[0,u]}, \text{ for all } u \ge 0\}.$$

Definition 2.8. $\xi$ is said to be strong Markov if for every stopping time $\tau$,
$$P(\xi_t \in C \mid \mathcal{F}_\tau) = P(\xi_t \in C \mid \sigma(\xi_\tau)) \quad \text{a.s. on } \{t > \tau\}$$
for all $t \ge 0$ and $C \in \mathcal{B}(\mathbb{R})$.

Intuitively, this says that the information from the process up to the random time $\tau$ is conditionally independent of the information from the process after the random time $\tau$, given the information from the process at the random time $\tau$. As the extension of the Markov property to $\mathbb{R}^d_+$, $d > 1$, corresponds to the conditional independence of the information of the process in the 'interior' and 'exterior' regions of a set given the information on the boundary of the set, the extension of the strong Markov property to $\mathbb{R}^d_+$, $d > 1$, corresponds to the conditional independence of the information of the process in the 'interior' and 'exterior' regions of a random domain given the information on the boundary of the random domain.

Strong Markov properties of this type were proposed by Evstigneev ([Evs88] and [Evs77]) and Rozanov [Roz82]. Exploring Evstigneev's most recent strong Markov property is the main purpose of this thesis, and it is defined below. For notational convenience, define
$$\mathcal{A}_1(A, B) = \mathcal{F}_A, \qquad \mathcal{A}_2(A, B) = \mathcal{F}_{\overline{B^c}}, \qquad \mathcal{A}_3(A, B) = \mathcal{F}_{\overline{B \setminus A}}$$
for $A, B \subseteq \mathbb{R}^d_+$ with $A \subseteq B$. Let $\epsilon > 0$ and let $\alpha, \beta$ be random domains such that $\alpha(\omega) \subseteq \beta(\omega)$ for all $\omega \in \Omega$. Let $i \in \{1, 2, 3\}$.
Define Af(a,fl) to be the a-algebra generated by a,fl and sets of the form {Fe Ame 2 BmI‘ where F E Ai(A,B) and a—‘(w) = {t Z 0 : d(t,a(w)c > e}. Let Ai(a,fl) E 000 Af(a,fl). Definition 2.9. A random field 6 is t a k v with r e to a random domain Q if, for every two a(Q)-measurable random domains or, ,8 with a(w) Q Q(w) Q ,B(w) for all w E Q, A1(C¥, fl) J-L A2(a$ fl) |A3(OI,,B)- Evstigneev [EvsSS] proved that each of the following conditions on a random domain Q is sufficient for a Markov random field 6 to be strong Markov with 14 respect to a random domain Q: {Q Q B} E 7:3, for all B e T (2.6) {Q 2 B} e .7-‘3, for all 13 e T (2.7) The following lemma is helpful in relating condition (2.6) to some more com— mon conditions. Lemma 2.5 Let Q be a random domain. (i) {Q Q A} E .7714, for all open subsets A of R1 is equivalent to condition (2.6). (ii) (2.6) implies {Q Q V} E fv, for all compact intervals V of R1 . (iii) If Q = [71,72] for some random variables 71,12 such that 0 _<_ 110») < 12(w) < 00, for all to E Q, then {Q Q V} E TV, for all compact intervals V of R1 implies (2.6). Proof. (i) We shall first show that {Q Q A} E .77 A for all open subsets A of R1 implies condition (2.6). Let V E T. Since 00 fvc = fv, it is enough to 6) show that {Q Q V} E TV. for every 6 > 0. Let c > 0. Then {Q Q V} = {'1 {Q Q Vii} E fve and we are done. Now assume condition (2.6) and let n=m+1 A be an open subset of R1 . Using that A is open and Q(w) is closed for each wen, {ow}: gnu-her... (ii) Consider the compact interval V = [a, b] . If a = b, then V = {a} and {Q Q V} = (b E 7v , because Q(w) is a domain for each to E Q. If a < b, then V E T and {Q Q V} E fv by condition (2.6). 15 (iii) In order to prove (2.6), we shall show that {Q Q V} E fve for every VET and e>0. Let VET and e>0. Then 00 {09V}: 0 U {Q§[8.tl} n=lfil+1 ssevth; [3;]th is an element of fv1 Q fvc and the proof is complete. D We will later show that condition (2.7) is not equivalent to E having the strong Markov property with respect to Q. The next theorem describes the relationship of a random domain Q with the random set “Qua. Theorem 2.2 Consider the random field E = {fthep where D is some compact subset of R1 . (i) Q is a random domain if and only if Q? is a random domain. (ii) 6 is strong Markov with respect to Q if and only if 5 is strong Markov with respect to Q5. Proof. (i) Let Q be a random domain. Since Q is closed, we have that QC is open and thus Q3 is a domain. Hence Qc(w) E T, for all to E 9. Using results (2.10) and (2.15) on pp. 79,80 of [Roz82], we can get the measurability of Q . Therefore Q“ is a random domain. The reverse implication follows by the above and the fact that (Q5)C = Q. (ii) Assume 5 is strong Markov with respect to Q. Let a and ,6 be 0(3)?)- measurable random domains such that a Q Q? Q fl. Note that for an arbitrary A E T, it holds that 71-5 = A, 8A0 = 8A, and (715)“ = A0 ([Roz82, pp. 80-81, (2.17), (2.18)]). Thus {Q _C. A} = {62" g A”} = {(Q")“ 2 (A”)°} = {"6973 2 7F}. 16 Since 715 E T and T is generated by sets of the form {B E T : B 2 C} where C E T ([Ev388], Lemma 62(3)), we get that 0(Q) = 4?). Combining this, 0(a) = 0(3), and 0(6) = a(—fl_c), it follows that E and F are o(Q)- measurable. By the assumed strong Markov property, A1(F’E) J‘L A2(E—E)a-E) l “43(Fva—é) In order to prove the desired strong Markov property, it is enough to show A1(FE,a—E)= “42(av18)! (2'8) A2(I—'§éa?) = A1(a1fl)? and (2'9) sum—6,3?) = A3(a, s). 
(2.10) For e>0, A,BET with AQB and PEA,‘(A,B), iE {1,2,3}, {FQA}0{F2B}HF={(F)C2F}fl{(fl_—‘)c QEE}0F ={(?)“‘2F}H{W§F}HF- Moreover, A1(A,B) = .73 = FEB—77 = A2(F,AF), A2(A,B) = .732; = A1(F,AE), and = A3(—B.E,F). Aid/1,13) = 3,4an 2 W07? Hence Ai(a,fl) = §(-fl_c,?), §(a,,6) = {(Fir—é), and A§(a,fl) = $33,376). Therefore (2.8) holds. (2.9) and (2.10) can be proven in a similar fashion. The reverse implication follows from (air = a and CST)" = fl and the same techniques used above. El Chapter 3 Special random domains We will now look at some specific random fields { and random domains Q such that 6 has the strong Markov property with respect to Q. 3.1 Random domains of the form Q = [0,7] when d: 1. In this section, two theorems with different hypotheses and identical conclusions will be stated for d = 1. The second theorem is decidedly stronger than the first theorem. However, the proofs of the two theorems use different techniques. The proof of the first theorem uses an interesting result of Rozanov and condition (2.6). The second proof uses only (2.6) and is most similar to the rest of the proofs in Sections 3.3 and 3.5. Theorem 3.1 Let 6 = {50120 be a Markov random field with continuous sample paths. Assume there exists some open set A0 Q R such that {0(w) E A0, for all 18 w E 9. Define the random set A(w) = {u 2 o = saw) 6 A0} and assume that /\(w) is bounded for all to E 9. Then 6 is strong Markov with respect to the random domain Q = [0,1'], with T(w) = inf{u Z 0 : £u(w) ¢ A0}. We call T the first exit time from An. Before we prove this theorem, we need to state a result of Rozanov ([R0z82], p. 82). Theorem 3.2 Let (9 denote the collection of all open subsets of R3, and let uo be a fixed point in RI. If D is a random domain, co-compatible with (9, then the connected component D0 of D containing uo is compatible with 0. Proof of Theorem 3.1. We will first show that T is a random domain. Note that the set /\(w) is open in X since A0 is open and {(w) is continuous for all u) E fl. Thus Who) 2 The), for all to E Q; that is, X0») is a domain for all to E 0. By assumption that T0») is bounded, for all to E Q, and since The) is closed, for all to E 9, we have that The) is compact, for all a) E 0. Thus X0») E T, for all w E 9. Next we must check the measurability of A. Since /\ is open, it is enough to check if {A 2 A} E .7, for all A E (9 ([Roz82], pp. 79—80). Now {A2A}={X2A}=fl 0 U {éVEAo}, (>0uEAnQ VEB(u,c)nQ 1.9 which is an element of .7 A , for all A E 0 using that A is open. Hence T is measurable, and thus T is a random domain. From {:1 2 A} E 7/1, for all A E (9, we also get that T is co-compatible with the family {0(Eu, u E AllAEO- Applying Theorem 3.2 with D = 3: and an = 0, yields that Q = [0,7] is compatible with the family {gAlAeC}, where r = inf{u Z 0 : {u ¢ A0}. Let B E T. Then {Q Q B} = 2:1{Q Q 131/71} since B is closed. Also, {Q Q Bl/fl} E 98% , since Bil? is open and Q is compatible, for all n E IN. So {Q Q B} E 030:1ng = .73 since ng Q 0%S€<;%f 63:, for all n. This inclusion and condition (2.6) yield that 6 has the strong Markov property with respect to Q . E] It turns out that Theorem 3.1 holds for any positive stopping time r, not just exit times. The path continuity and initial conditions on é can be removed to yield the result given below. Theoreln 3.3 If C is a Markov random field and r is a positive finite stopping time, then 5 is strong Markov with respect to the random domain Q = [0,1']. Proof. We must first show that Q is a random domain. 
Certainly, using that 0 < r(w) < 00, for all to E 52, we get that Q(w) E T, for all to E Q. Moreover, given V E T and e > 0, {Q g V} = ii {[0,7-19; Vt} n:[-:- +1 = a U {r_<_s}. n-----[-.‘-]+1 set/tuna); [0,s1gvt L—l The latter equality comes from the proof of the following lemma. 20 Lemma 3.1 {tents/i}: U {r33} .96th02; [0,s]gV% for arbitrary n E N. Proof. If u E [0,r(w)] and there is an s E Vi (N) such that [0,3] Q Vii and r(w) S s , then u E [0,3] Q Vii. Hence ' U {r33} 9 {tons Vi}. sevt‘nQ; [0,s]QV?i‘ Furthermore, if {0, r(w)] Q Vi , then by Vii—(101w), 00) being open, there exists some rational s E Vi such that [0,r(w)] Q [0,s] Q Vii . In particular, r(w) S 3. Thus U {.- s s} 2 {[0,7] 9 vi}, sEVAnQ; [O,s]QVb and the lemma is proven. Continuing with the proof of Theorem 3.3, we have {Q Q V} E fyc for each 6 > 0 and {Q Q V} E .7V. Hence Q is a random domain and 6 is strong Markov with respect to Q using condition (2.6). D 3.2 A counterexample We are now ready to construct an example of a process 5 and a random domain D such that { is strong Markov with respect to D but D does not satisfy (2.7). Let go , E1 , and E2 be independent random variables. Define 3 ,. .. {t = Z til-11(n—1,n](t) +€ol{0}(t) for t 6 [0,31 n=l 21 Suppose that go lies in some open set A0 and let 7' = inf{t Z 0 : ft ¢ A0} . Furthermore, assume that 7(a)) < 00 for all to E Q and 0 < P(r 2 1) < 1 . Note that .f is classical Markov since gm] 2 0(60) is independent of 451,52) 2 gm]. Thus 6 is Markov and has the strong Markov property with respect to the domain Q = [0, T] by Theorem 3.3. By Theorem 2.2, § also is strong Markov with respect to the random domain QC- : [r, 3]. However, {Z27 211,31}={1~r,312[1.31} = {T 2 1} = {r < 1}” E 9m,” since {7' < 1} = 1:)ng g 1- ;1,-} and {r g 1 — %} e g[0’1_1] g g[0,1] for each 72. Suppose {Q0 2 [1,3]} 6 gm]. Then {Qc 2 [1,3]} 6 910.110 g[1.3] and, since gm] is independent of 911.3], P(T 2 1) = Mae" 2 [1,3]) = 0 or 1. This contradicts our assumption that 0 < P(r 2 1} < 1. Hence {'QE 2 [1, 3]} e g[l,3] and condition (2.7) does not hold. 3.3 Random domains of the form Q = [0,r] when d > 1 Recall that membrane theory was discussed in Section 2.1. The extension of Theorem 3.3 now follows naturally. Theorem 3.4 Let 6 be a Markov random field and let 7' : Q —t M be a map such that {T _<_ 114} E g[0,M], for all M E M (3.1) 22 Then £ has the strong Markov property with respect to the random domain Q = [0, r] . Proof. We need only to show that Q is a random domain and that (2.6) holds. Since Q(w) E T, for all to E Q , we may conclude that Q is a random domain once we have proven (2.3), as was done in the proof of Theorem 3.3. For arbitrary VET and e>0,define d ’H’f/={MEM:[0,M]QV1/", MD(U{uER1:uj=0forj #21591} i=1 Then given any VET and e>0, {09V}: (1 U {QEM} n=[-l;]+l ME”? 6 0 9w (>0 which equals .7V using the hypothesis. [3 3.4 Stopping lines In [MN90] a special type of membrane is defined for d = 2. Let M be a R1-membrane. Define Q(u) = M {'1 {x E R1 : x1 = u} for u E [0, 1]. We shall say that M is a W113; if Q(u) is a singleton for each u E [0,1] and the map Q on [0, 1] is non-increasing, where Q(u) is defined to be the y-coordinate of Q(u) . Let L(w) be a decreasing line for each to E Q . Then L is said to be a stepping line if {2 = C; S L} E 9px;]. for all z E R1 where CzE{$ER1:xl=21,OS$2_<_22}U{:BER1:x2=22,03x1$21}. ~ 23 Let N be a point process and L be a random membrane. 
Upon application of Theorem 3.4, if N is Markov and satisfies {L S M} E g[0,M] for M E M (3.2) then N is strong Markov with respect to the random domain Q = [0, L] . Evstigneev showed that there is another strong Markov property such that N is strong Markov with respect to Q which also follows from condition (3.1) for N and Q. This strong Markov property is A(Q) -U— A(7) I #1090) where A5(D) E 0(D,{7 Q B} F] F, B E T, F E 913) for 6 > 0 and A(D) = fl A€(D) for any random domain D ([Evs77]). e>0 Merzbach & Nualart define the following strong Markov property: For a set—valued random function D, let H1) be defined to be the a-algebra generated by D-1(T) and the random variables 1D(z)Nz, z E R1. We say that N has Merzbach & N ualart’s strong Markov property with respect to D if 710 .11. H5; I “60- Merzbach & Nualart’s strong Markov result is stated below. Theorem 3.5 [MN90] If L is a decreasing stopping line and N has the simple Markov property with respect to sets of the form [0,5], t a decreasing line, then 'HQ JL’HQ'C‘I'HL. 24 Next, we compare sufficient conditions on L. Lemma 3.2 If L is a decreasing stopping line, then L satisfies condition (3.1). Proof. Let L be a decreasing stopping line. Since L is decreasing, it is enough to show that {L S M } E gm, M] for arbitrary decreasing lines M E M . Choose an M as described. Then {LSM}={L§$M}° =( U {02 SLW zEMnQ2 Now {Cz S L} E 9pc.) by L a stepping line. Moreover, gm’C‘] Q g[0,M] for z E M (1 Q1 since [0,Cz] Q [0,M] and M is decreasing. Thus {L S M} E gw’M], and we have that L satisfies (3.1). The converse of this proposition is not true, as the following example shows. Counterexample 3.1. Here we present an exmaple of a point process N and random membrane L such that L is decreasing and satisfies (3.1), but L is not a stepping line. Let N be Poisson. Recall that N is a.s. boundedly finite and without fixed atoms. We can apply Theorem 2.4 VII of [DV88] (p. 35) to get that for every finite family of bounded disjoint Borel sets {Ai,i = 1,. . . , k} , the random variables N (A1),...,N (Ak) are mutually independent. Define Mu = {(r,y):y=u—r}flR1 for u>0 and L(w) = inf{Mu if{8(w)1[0,Mu]fl£.(u)‘1([3,oo))(3)’\2(d3) 2 6}. 25 Assume that z E R1 and {CZ S L} ¢ {0,0} . Then given any Mu , {L s. Mu} = {w = / €s(w)1[o,Mu]ng.(o)-1([3,oo))(8)/\2(d8) 2 6} E g[O,Mu] Thus L satisfies (3.1). However, for z E R1 , {02 S L} = {L < Mzi+zzlc U {L _<_ My} VEQ:D(O,21+22) E g[0,M31+32]’ By independence of N([0,le+z,]) and N([0,C'z] fl [0,Mz1+z2]c), it does not hold that {0; S L} E g[0,Cs] . Thus L is not a stepping line. The above lemma and counterexample show that (3.1) is a weaker assumption on L than the assumption that L is a stepping line. However, condition (3.1) is sufficient for N to have Merzbach & N ualart’s strong Markov property. We shall new state and prove our version of Theorem 3.5. Theorem 3.6 Assume A(Q) V A(—QE) = .7 and (QC ’H[L__1_ L+i] = HL. If n=1 "’ " N has the simple Markov property with respect to sets of the form [0,6], 8 a decreasing line, and L satisfies (3.1) then HQ .11. Ha? I H1, Remark. The condition A(Q) V A(QE) = .7 is clearly satisfied if Q is a 00 deterministic compact domain. Also, the condition 0 H[L— _1_ L + 1 1 = ’HL is 11:] "’ " 00 similar to the condition 01%“, Ln] = HL which is proved in [MN90] for L a n: decreasing stopping line and {Ln} a sequence of decreasing stopping lines which converge to L . 26 In order to prove this result, we will need a few lemmas. 
Lemma 3.3 If N has the simple Markov property with respect to set of the form [OJ], L a decreasing line, then N is Markov. Proof. Let B E T. We must show that .7? .11. .70 [ W for all A,C E T with A Q B Q C. Certainly 9“”) 11 gm]. [ 95 for any decreasing line 6. Moreover, gm) _U_ Qmoc I Qg using Proposition 2.1 and gm). Q 910.5? V 95. Thus, by Proposition 2.2, we know that GD JL 91)? I 961) for all sets D of the form D = [0,6) U [0,s]c where e S s and l,s are decreasing lines. This, of course, is equivalent to 90 .11 95c IgaD for D=(€,s). Given any open convex set 9 Q R1 , there exists a collection {(Ei, 3i)}iEN of pairs of decreasing stopping lines such that E,- S s, for each i E IN, U (35,35 — i) = B, i6 and (6,33,) 0 (t’j,s_,') = (15 for i 761'. Thus, by Proposition 22, N has the simple Markov property on 9; that is, grngaflgao- 27 According to Corollary 3.1 of [Man83], this is equivalent to N having the simple Markov property on all open subsets of R1. Hence, N has the GFMP on all open sets by Theorem 2.1, and the rest of the proof is identical to the last part of the proof of Lemma 2.1 with a = A and b = B . Cl Lemma 3.4 (i) HQ? Q A(Q) V 245(1)) and HW Q 1“?) V A€(L) for all 6 > 0 (ii) A(Q) 9 Ho and A?) g "rt—Q: (iii) Asa?) = H[L—e,L+e] Proof. (i) We shall first show Ha; Q A(Q) V A€(L) for arbitrary e > 0. Consider C E [3(R+) , z E R2 , and note that 1 e is equal to {10 (leIECl 1{lar(z)}1{Nz€C} + knack) if 0 E C 01‘ 1{15r(2)}1{NzEC} if 0 g C. Furthermore, note that (Q’s-)6 is 0(Q) -measurable and hence A(Q) ~measurable. Thus, it is enough to show 1{zeQ7}1{N,EC} is A(Q) V A€(L) -measurable. Now 1{ze©?}n{N.EC} = 1{zeo)n{N.e0} +1{ze[L—e,L+s)}n{N.eC} “ 1{zEQ}¢1{ze[L—6.L+e]}n{N.EC}' In addition, since Q(w) E T for each to E (2, {z 6 Q} 0 {NZ 6 C} = {7;}? 2 B(z,c)} n {Nz e C} 6 245(6)). 28 Thus {2 e Q}fl{Nz e C} e A(Q). Because L is decreasing, {z e [L—e,L+c]} = {IL—e, L+el 2 W} UttL—e. L+el 2 F6723} U(0_ 21, t _>_ 22} B_—(Z,2€) = B(z,2c) fl {(s,t) : s S 21, t S 22}. Since the three summands are disjoint, {z E [L —- e, L + 6]} fl {Nz E C} E AE(L). Therefore, 1{zEQ‘—}1{N;EC} is A(Q)V.A€(L) -measurable. The proof of Hm Q A(—Q_E) V A€(L) is proven in a similar fashion. (ii) We show A(Q) Q ’HQ by proving A5(Q) Q ’H-QT for all e > 0. Consider a set {_QE 2 I‘} n A where I‘ E T and A E gr. Now A E gp implies the existence of a countable collection Z Q P and a measurable function (I) such that 1A = (Nz,z E Z). Moreover, {QE 2 1"} = fl {3: E Q5} for zEI‘nsupp(N) the countable collection X = F D supp(N) Q R1 since supp (N) is countable. Therefore 1{-Q—‘2P}0A = HX 1{xe§E}(I’(Nz,Z E Z) x6 = 1,, .(1z—.Nz,zEZ) sci—tin {63'} {662} is Hat-measurable. So A€(Q) Q 716,-. The proof of A5 (Q?) Q 71W is handled in a similar fashion. (iii) Proof of A€(L) Q H[L-e,L+e] uses the same technique as in (ii) and will not be shown here. In order to show that reverse inclusion, consider 6 > 0 , z E R2 , and C E 3(R). Then 1{[L—6.L+€]2{z}}fl{N.EC} + 1{[L—6.L+e]2{z}} if 0 E C 1 1 _, t 2 NZ C ={ . { [L M I” E } 1{[L—e,L+e];{z}}n{N.eC} 1f0¢ C 2.9 Now {[L — e,L + c] 2 {2}} e A€(L) and so it is enough to show that {[L—c,L+c] 2 {2}}n {N. e C} e A€(L). Now {[L—e,L+e] 2 {2}} = (6L>JO{[L - 5.1: + e] 2 WWW - eL + e] 2 WNW — eL + e] 2 B“(z,25)} and so {[L— e,L+e] 2 {2}}fl{Nz E C} E A5(L). Thus HIL—e,L+c] Q A(L)- D Proof of Thin 3.6. By Lemma 3.3 and the fact that (3.1) holds, we get A(Q) 1L #7) I A(L)o Since A(Q) v A(Z2‘C) = .77 and A(L) c_: A5(L) c; f for any c > 0, A(Q) 11- 4'52?) | AWL). 
Now A(Q) V 246(L) _LL A(QE) V A€(L) I A€(L) by applying Prop. 2.1. Apply- ing Lemma 3.4(i) now gives us HQ? .11. ”(Q—0‘ I A€(L) , and Lemma 3.4 (iii) yields 7Q? .lL W I H[L—e,L+e] . By the Martingale convergence theorem, :ij Hi L— g ’ L + :1; l = ’HL. Since ’HD Q ’HDr for all set-valued random functions D Q D‘ , we conclude that ’HQ .11. H3: I ’HL . C] 3.5 Random domains of the form Q = [T1,7‘2] Another type of set-valued random function that may be of interest, particu- larly when considering reciprocal Markov processes, are of the form Q = [71,72] and T1 and 7'2 are positive random variables that satisfy 0 S 71(w) < T2(w) < 00, for all to E fl. 30 For T1 and 12 membrane-valued random functions satisfying the above inequal- ities, we have the following result for d 2 1 : Theorem 3.7 If C is a Markov random field and T1 and 7'2 satisfy {M13 7'1 < 72 S M2} 6 QIM,,M2], (3.3) for all M1 E M U {0} and M2 E M with M1 S M2 then f is strong Markov with respect to the random domain Q = [71,72] . Note: If 1'1 and 7'2 are WW that is, if for i = 1,2 {M13 Ti S M2} 6 Q[M1,M2]. for all 11/11 E M U {0} and M2 E M with M1 S M2 then the sufficient condition (3.3) holds. This follows from the relation {M1STi 0, 00 {QSZV}= (l U {MlST1<7‘ZSM2}€gV‘- n:[-:-]+1(M1,M2)ESI} Hence {Q Q V} E {100 CW = .7V, and we have that g is strong Markov with respect to Q . Cl Chapter 4 Corner Markov and Reciprocal processes Definition 4.1 A random field E for d = 1 has the reciprocal Markov property if 9L9,” Ji— g[0,s]U[t,oo) Iggy}, for all s 2 t 2 0. The type of strong Markov property that a reciprocal Markov process has and the extension of this reciprocal Markov property are two topics that are treated in this chapter. 4.1 The corner Markov property One way that the reciprocal Markov process can be extended to R1 utilizes “corners”. Define a mg; Kg for a,b Z 0 by Kg={uER1:u1=a,0Su2Sb}U{uER1:u2=b,0Su1Sa}. 31 Figure 4-1 corner Kg xx}; :e-axis The following result puts one-dimensional conditions on a process that in turn yields a “corner Markov” type of property; that is 9pm] JJ- 9mm” for all a, b 2 0. Theorem 4.1 Assume the following: Ai : gimu] .LLgliu’oo)’ I Gin}, for all u 2 O i = 1,2 A3 : g[10,u1]'u-g[20,u2] I g[0,K3f]’ for all 111,112 2 0 Bi 2 Alglu‘) nMg[0,K312] = M923}, where MA E {f E L2(Q,.7, P) : f is A-measurable} for any sub-a-algebra A of .7 and Gig, E 0(X(,l,82) : s,- = u;,0 S 33-,- S 113..) i = 1,2 Ci 1 E(P(Blg[u,})lg[o,1{3§]) = E(P(BIQI0’K3?])|QEW}), i = 1,2 for all B E Q(u)} such that s _>_ u1,t 2 112 for all 111,112 _>_ 0. Note that A3 is a Cairoli-VValsh [CW78] (F4)-type of condition. Then g{0,1{3;2] .11. ngldf for all 111,112 _>_ 0. 33 Proof. Given any u = (u 1, 112) E 72.2,. , define the following regions. [1] = {(s,t) : .9 2 ul, 0_<_ t S 112} [2]={(3,t):0$s£u1,t_>_u2} [3] = {(s,t) : s 2 ul, t2 uz}. For notation convenience, let K = K312 . Part I Let B, 6 gm 2' e {1,2,3}. If i 5 {1,2}, then P(B..|g[Q,K]) = P(B,~|gf0,,,,]) using A3 and B 6 9,0; ..-1' Furthermore, P(Bilgf0,u,]) = fizz-19;“) by A.- and B,- 6 game). Thus P(B,-|Q[O,K]) = P(Bilgiu.~})' Otherwise, P(133 Igmm) = E(P(133l9f0,u,])|9[o,1c]) by 9mm 9 glam], WhiCh in tum equals E(P(Bglgiui})lg[o,1{]) using A; and B3 6 glance)‘ Thus P(Bz' lgl0,K]) = p(Bi|gfu,}), 2': 1,2 *(i) and HR": |9[0,K]) = E(P(33 lgiu,})|g[0,1{])- *(3i) Notation. Given asub- a -algebra A of f, let IPA denote the L262) -projection onto MA. Of course, IPAf = E(f I A) , for all f E L2(Q) . 
Part II Now we prove P(Bi Igm K]) = P(Bi IQiu ), and P(B3 I g[() 1(])— "’ P(B3 lgzili) for i— - 1, 2. First, let i E {1, 2}. Note that Pgmxfl %1,11; 6 Mgiu} HM Gm K] by *(z ) and that Mgiu}; nNIng1=MGin by 85. Moreover, . X — 1 . > .(Y '— 1 . X 615,323? ll all x6131}; } ll all 34 Mag; 2 Mg...” and X631}; llX-13.ll =Ilrg,,,,13.—IB..II = “mg...”— {as} 13." by *(z)- But llll’glmla- - 13.“ s “max? IIX - 13.“ and Pgrmlm E MgL‘f together imply lng[o,xllBa-13s" = Xenia.” [IX—13..“ . Hence P9{o,x]13s = .ul 139-313,; that is, P(Bi |9[0,K]) = P(Bi IQiZi - Next we consider B3 . Lemma 4.1 (IPg,0 K11130- JlUi} )"1133 = IPQ[0,K]133 for all n E IN . Proof. We shall prove this by induction. The proof for n = 1 follows from *(3)- Next, assume the result holds for 71; show the results holds for n + 1: Now . +1 _ . . (PQ[0,K]]Pgi,,i} )n 133 — (P9[0,K]]Pgiui} )(Pg[O,K] IPgiu‘})nlB3 = (PngqIPgiufl )(Pg[o,K]133) by assumption. Moreover, (P9[0,K1 P92” )(PngqlBs) = IP9[0,K]IPQEU,}Pg[o,K]IPGE,,,} 1133 = 1139‘me EDGE,” I) 93“,} Pgmx] 1133 by applying *(3) and Ci. One more application of *(3) allows us to conclude . l __ UPC/10.x] Pandlnl‘ 1133 _ 1PQ[0,K]1133 ‘ Thus P(B3 lgigf = P(B3 lgiui} n g[0,K]) = "Iiigo(n)g[o,l{]lpgiui} )nlBa using [3; , and nl—ilgo(][)9{o,xlmg‘ )nlBa' : II)glomlBa : P(B3 IPQIOJG) {Us} 35 by Lemma 4.1. Hence P(B3 lgiiff = P(B3 Ingm'Kl). Part III Next, we show P9{ox]135 = nglB; for i = 1,2,3. Now Pang; = PGK (IPgIO'Kl 18") and IPGK (PG(2_I,'_ (2— l -2|)"2B 2|)“21‘B) = Pg<2-Ii-2I)“§ 13, since MQiu'f Q ‘ngifif for i = 1,2. By applying Pg 13 =1Pg,0 K113 twice (by Part II), we get the desired result. Part IV In order to complete the proof, we must show that Part III holds for sets of the form B1 n B2, Bl 0 33 , 82 fl Bg , and B1 0 B2 n B3. We shall first consider B1 0 HQ . Observe that Pg“, K] 181082: 11) gm K]1 P gm K]1 = “MM. '1ngle = IPgK(13l -]PgK132) = PgK(1B,Pg[o,KllBgl = rgxmvgmug, .IPgW, 132)) = IPQK(]P9[0,1 K] 32',1P9[o K1131) = H’gKUPgtmmlBlnBz) = nglBinBz Note that the above implies by A3 by Part III by ngK 132 E MgK by MQK Q Nffiom by Pglo’K1132EA/[glo’1q by A3 by MQ'K Q MQ[0,K]' PgmxllB f: IPgKIB. f, for all f E Il/Ig[3_ l’ (4.1) 36 where i = 1, 2. Next we shall consider Bi 0 B3 for i E {1, 2} . Recall that PM 3-.- 133 = PMngq VMQallBa = 1139(qu 133 + P931133 — 1Pglmkllpgli] 133 + P9[51P9[0,K1P9u1133 " Pgmxllpfiqnjfimmlpfii] 133 + ' ° '(4°2) We shall show that each summand 18 in Mg”. First, IPg,0 K1133 Pg "2 1336 MC Jlil using Part II and Mgifif _C_ Mgm . From Part II, we see that E(f|g[0,](])= E(flgiii2), for all f 6 Mg”. Combining this and the relation Mgg; g Mg“, , we conclude that Palmlpgmlaa = P9i3§(PG[a]133) = P933133 6 Mam- Using the same techniques as above, we see that each term in (4.2) is an element of Mgm , and so we can conclude that IPQEJ—i ]133 E Mgli] . We will use this fact v“3—:‘ below. P910,“ 183183 = Pam (113i ngj;3_,, 133) by MC[O,K] C M913-.- _-1 and B; 6 9,075,341. Using (4.1) and PQEJTja-i1133 E Mgm , we have IPgKuB‘ . Pgllftls—illBs) = ngloJ‘lUB‘ 11393175341183). We eventually conclude that 1P9[o “13,033 = ngK 133133 by recognizing that Ang _C; Mga—i] and B; E Mgs-i [,_Oua -i] J[,_003 il Finally, we must prove ll’glo’ml 31"132nB3 = ngK 1 31n32n33, for each Bi 6 931' Lemma 4.2 IPG[o,K]1BmanBs 6 Mg, . 37 Proof. 
Recall that 1Pg[(,,,'.,131r133 = PMgIO'K] V ”0(2) 133133 = Pgwq 181083 + 1PgmlBlnB3 _ Pglwfl 1P9121 131033 + Pgm IP91”) Pgm 131033 _ Pfiom IP93] Pylon 11393111310133 + ' ' ' Thus IP9[0,K] 131032033 2 IPgl°vK11Pgilo~u(lBlnanBa) by Mg”, g Mglb,u1] : 1PQ[0,K](182 . Pgllo’ulllBlnBa) by B2 6 Mg[10,u1] : IPQ[0,K](1B2 ‘ 1P9[o,K]131rlBs) + Pg[o,x}(132 ' 1P93111310133) _ Pfiomaflz _ 11393.1(] 1139(2) 131033) + IPQquuBz ' PG[2}PQ[0,K11PQ[2]131033) ‘ Pgloxdlfiz ' Pg[0.K]H)g[2]Pg[0.K]]Pgl2]131n33) + ' " We must consider each term separately and show it to be in MgK . First, P93141132 -1Pg[0,,qlBlnBal = IPQK 132 ' IPQK 131033 using ll’(_;[0_,(,13lm33 E MQ[0,K] and Part III. Next, Pg[0,K](1132 PgmlmnBa) = Pg[o'K](]Pg[2]]-BlntnBa) = P9K(IP(J[2]1BlanBal by (4.1) and B2 6 gm . nglo'Klgm13ml}a E Nlfiox] and (4.1) together imply ng[0,K](lB2 ' IPQ'qu 1139(21131033) = 1P9[0,K]IP9[2]131033 ' 1P§/'[o,K]1132 = IPQK(B2 ' PG[0,K]1Pg[2]131nBS) : 1PQ'Klpgp]11310133 ' IPGK 132' 38 These same techniques will yield that every other term in the series is indeed in MQK . Hence IPgwm13,032.133 E MgK . This claim and the fact that min ”X — lBlnanBall 2 min “X — 181032033” xEMgK “(0,10 = “IPGWJQ — lBlnanBall will give us ngK 181032083 = P9[0,K]lBlflanBa' The last step is using the 1r — A Theorem to show that ngK 131(732r133 = Pg[0,KllBlnB2nB3 for all B; E 93} implies ngK 13 = P§[o,x]13’ all B E 9“),ch . .C E {A E f:]Pg[0,K]1A = ngK 1A} is a A-system as follows: i) lpglo K119 = 1 = ngK 19 implies that Q E C. ii) Let A E .C . Then Pglo.K]1Ac=1_]Pg 1A=1"']PQK1A=IPQK1A¢ [0.Kl since A E C . iii) Let {An} g .C be pairwise disjoint. Then PQ[O,K]1UA.. = 1P(1uAn lg[0,K]) = “2,3131” l9[0,1{]) = in: 11’ (1A.. lg[0,K])An E 5 by pairwise disjointness. Furthermore, in: 1130/1" |9[0,1<]) = 2;“)(1/1n ng) = H)(;1An mm = IPQK 1UA,.- Therefore U An E .C , and we conclude that .C is a A-system. n Now apply the 7r — /\ theorem to get that ngK 13 = 1P9,0 K113 , for all B E 93““. Therefore 9mm .11. Q(WIQK. 39 The proof is now complete. D Of particular interest are random domains Q which take on the form Q = [0,Krrf] for 11,12 nonnegative random variables. That is, Q takes values in the class {[0,1{3} : a, b > 0}. Note that K3 is a membrane if a, b > O. T It is desirable to determine what (2.6) means when Q = [0,1133] . Lemma 4.2 (2.6) holds if and only if {qu 3 K3} 6 110“], for all a, b > 0. (4.3) Proof. Certainly, (2.6) implies (4.3), since [0,K2] E T when a, b > 0. Moreover, (2.6) follows from (4.3) upon noting that, given V E T and e > 0 , {Q <_: V} = n U {Kg2 3 K2} 6 fve. n=[%l+1 a,beQ+,[o,Kg]gV% Since flog fvc, we are done. C] When we write (4.3) as {T1_<_ (1,7'2 S b}fi0,1(g], VG, b > 0, we are reminded of a stopping time-type of condition. Theorem 4.2 Let 6 be a Markov random field and 1'1 and 72 be random variables which satisfy 0 S 71(w) S Q(u)), for allw E Q. If {K122 3 K3} E 310,373] , for all a,b > 0, then 6 is strong Markov with respect to the random domain Q = [0,K7Tf] . 40 The random analogue to Theorem 4.1 also arrives at a strong corner Markov property and is stated below. Theorem 4.3. Let 71,112 be covntably-valved stopping times, and let K E K? for notational convenience. Let f; = 0(A n {n' = a} : a 6 mm), A e afofll). I; = 0(A n {n' = a} : a e ri(Q), A e gfoflp, and f..- = o(A n {n' = a} : a e ri(f2), gird) for i = 1,2. Also define f1? = o_ b)) AiszJLFT'Elf—r, fori=1,2 A3 :13; _lLfT—z' If]? Then Elf-Mfr Proof. 
The techniques used in the proof of theorem 4.1 are the techniques used to prove this theorem with the obvious substitutions. f; , El , and 7'},- play the roles that gf'o’ulgf'um) , and GEM} played in the proof of Theorem 4.1, respectively. We use fR,.7-7{' , and .7: K in this proof for the 110,163] , W, and $3 used in the proof of Theorem 4.1, respectively. Also, .73 replaces GB] and M71.- replaces MQiL‘f in this proof. D 4.2 A Martingale Approach The following question arises naturally upon considering reciprocal processes (dzl): What kind of strong Markov property, if any, does a given reciprocal process have? Pasha [Pa882] showed that every Gaussian reciprocal process 6 on a compact interval [a, b] can be expressed as Q=K+$§+E® where Yt is Markov with trivial tails, A and B are real functions on [a, b] , and 460,65) is independent of 0(1’2) for each t E [a,b]. If Y1 i 0 for every t and if 6 is continuous in quadratic mean, then Y; = ¢Ut for all t, where (I): is a real function on [a,b] and U is a martingale. One can look at a stochastic 42 differential equation for the process Z; = (1’1,Ag€a,B¢{b)T. Zt satisfies the stochastic differential equation ' (I’l‘l’z’l 0 0 in 0 0 dU¢ dZt= 0 AQA;1 0 tht+[0 0 0] [ 0] 0 0 Bfo1 O 0 0 0 under appropriate differentiability assumptions on ,A , and B . By the nature of this differential equation, one needs techniques from [SV79] to study the strong Markov properties of its solution. In the special case of Agfia +B¢€b = 0 for every t (that is, é is independent of its boundaries), one can use techniques exhibited in [Str87] and [SV79] to consider 5 as a solution of a stochastic differential equation, and then apply Theorem 6.6.2 of [SV79] to get a strong Markov pr0perty on g. Because the equation involved in the general case is a singular differential equation, one must define the strong Markov property delicately. This work is currently under investigation. Bibliography [own] [D V88] [00044] [Dud82/ [Evs77] [Evs82/ [Evs88/ [KM74/ Bibliography R. Cairoli and J. B. Walsh. Re’gions d’arrét, localisations et pro- longements de martingales. Z. Wahrsch. Verv. Gebiete 44:279-806, 1978. D. J. Daley and D. Vere-Jones. An Introduction to the Theory of point processes. Springer- Verlag, Berlin-Heidelberg, New York, 1988. J. L. Doob. The elementary Gaussian processes. Annals of Mathe- matical Statistics, 15:229-282, 1944. R. M. Dudley. Empirical and Poisson processes on classes of sets or functions too large for central limit theorems. Z. Wahrsch. Verw. Gebiete 61: 355-368, 1982. I. V. Evstigneev. Markov times for random fields. Theory of Proba- bility and Applications, 22:565-569, 1977. I. V. Evstigneev. Extremal problems and the strict Markov property of stochastic fields. Russian Mathematical Surveys, 87, no. .5 (1982). I. V. Evstigneev. Stochastic extremal problems and the strong Markov property of random fields. Russian Mathematical Surveys, 48:1-49, 1988. G. Kallianpur and V. Mandrekar. The Markov property for general- ized Gaussian random fields. Ann. Inst. Fourier, 24, no. 2 (1974). 43 [Kun79/ [Lev48] [Man88] [McK68] [MN9 o] /Mol71/ [Nel78/ [Pas82] [Fir/1] [Roz82/ [Sim74/ [S V79] 44 H. Kunch, Gaussian Markov random fields. J. Fac. Sci. Univ. Tokyo Sect. IA Math. 26 (1979). Paul Lévy. Processus stochastiques et mouvement brownien. Gauthier- Villars, Paris, 1948. V. Mandrekar. Markov properties for random fields. Probabilistic Analysis and Related Topics, pages 161-198, Academic Press, 1988. H. P. McKean. Brownian motion with a several dimensional time. 
Theory of Probability and Applications, 8:335-354, 1963.

Ely Merzbach and David Nualart. Markov properties for point processes on the plane. Annals of Probability, 18:342-358, 1990.

G. M. Molchan. Characterization of Gaussian fields with Markov property. Soviet Math. Dokl., 12 (1971).

E. Nelson. Construction of quantum fields from Markov fields. J. Funct. Anal., 12:97-112, 1973.

E. Pasha. On the structure of germ-field Markov processes on finite intervals. Thesis, Michigan State University, 1982.

L. D. Pitt. A Markov property for Gaussian processes with a multi-dimensional parameter. Arch. Rat. Mech. Anal., 43 (1971).

Y. A. Rozanov. Markov Random Fields. Springer-Verlag, Berlin-Heidelberg-New York, 1982.

B. Simon. The P(φ)_2 model of Euclidean (quantum) field theory. Princeton Series in Physics, Princeton Univ. Press, Princeton, NJ, 1974.

D. W. Stroock and S. R. S. Varadhan. Multidimensional Diffusion Processes. Springer-Verlag, New York, 1979.

[Str87] D. W. Stroock. Lectures on Stochastic Analysis. Cambridge University Press, Cambridge, 1987.