The following program calculates the minimum point of a multi-variable function using the Conjugate Gradient (Fletcher-Reeves) method.
The program prompts you to enter:
1. The function tolerance value.
2. The maximum number of iterations.
3. An initial guess for each variable.
4. Whether to display intermediate values (Y/N).
If you request them, the program displays intermediate results to show the iteration progress. It also displays the following final results:
1. The coordinates of the minimum value.
2. The minimum function value.
3. The number of iterations.
The current code finds the minimum for the following function:
f(x1,x2) = x1 - x2 + 2 * x1 ^ 2 + 2 * x1 * x2 + x2 ^ 2
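This quadratic has a closed-form minimum, which is handy for checking the program's output: setting the gradient (1 + 4*x1 + 2*x2, -1 + 2*x1 + 2*x2) to zero gives x1 = -1, x2 = 1.5, and f = -1.25. A quick Python check (not part of the original listing):

```python
def f(x1, x2):
    """Test function from the listing: f(x1,x2) = x1 - x2 + 2*x1^2 + 2*x1*x2 + x2^2."""
    return x1 - x2 + 2 * x1 ** 2 + 2 * x1 * x2 + x2 ** 2

def grad(x1, x2):
    """Analytic gradient of f."""
    return (1 + 4 * x1 + 2 * x2, -1 + 2 * x1 + 2 * x2)

# The gradient vanishes at (-1, 1.5), where f attains its minimum of -1.25.
print(grad(-1.0, 1.5))  # (0.0, 0.0)
print(f(-1.0, 1.5))     # -1.25
```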
Here is a sample session. The first figure shows the input:
The second figure shows the output:
Here is the BASIC listing:
10 ! Conjugate Gradient (Fletcher-Reeves) method
20 !
30 ! USES JPCROM TO SUPPORT ADVANCED LOOPS AND
40 ! DECISION-MAKING CONSTRUCTS
50 !
60 N = 2
70 DIM X(N), D(N), D0(N)
80 ! Input data
90 DISP "Conjugate Gradient Optimization"
100 INPUT "Enter function tolerance value? "; E
110 INPUT "Enter maximum number of iterations? "; I9
120 For I = 1 to N
130 DISP "Enter guess for X(";I;")";
140 INPUT X(I)
150 Next I
160 INPUT "Show intermediate values? (Y/N) "; A$
170 If UPRC$(A$[1,1]) = "Y" Then
180 For I = 1 To N
190 DISP "X(";I;")",X(I);" ";
200 Next I
210 DISP
220 End If
230 !
240 ! calculate function value at initial point
250 CALL MyFx(X, N, L)
260 !
270 CALL GetGrads(X, N, D, D8)
280 !
290 L8 = 0
300 CALL LSrch(X, N, L8, D, 0.1, 0.000001, R0)
310 If R0 = 1 Then
320 For I = 1 To N
330 X(I) = X(I) + L8 * D(I)
340 Next I
350 Else
360 DISP "Failed linear search"
370 STOP
380 End If
390 !
400 I0 = 1
410 LOOP
420 I0 = I0 + 1
430 If I0 > I9 Then
440 DISP "Reached maximum iterations limit"
450 LEAVE
460 End If
470 D9 = D8
480 For I = 1 To N
490 D0(I) = D(I) ! save old gradient
500 Next I
510 CALL GetGrads(X, N, D, D8)
520 For I = 1 To N
530 D(I) = (D8 / D9) ^ 2 * D0(I) - D(I)
540 Next I
550 If D8 <= E Then
560 DISP "Gradient norm meets convergence criteria"
570 LEAVE
580 End If
590 L8 = 1
600 CALL LSrch(X, N, L8, D, 0.1, 0.000001, R0)
610 If R0 = 1 Then
620 For I = 1 To N
630 X(I) = X(I) + L8 * D(I)
640 Next I
650 CALL MyFx(X, N, F)
660 If Abs(F - L) < E Then
670 DISP "Successive function values meet convergence criteria"
680 LEAVE
690 Else
700 L = F
710 End If
720 Else
730 DISP "Failed linear search",
740 LEAVE
750 End If
760 If UPRC$(A$[1,1]) = "Y" Then
770 For I = 1 To N
780 DISP "X(";I;")";X(I)," ";
790 Next I
800 DISP
810 End If
820 END LOOP
830 !
840 DISP
850 DISP "**********FINAL RESULTS************"
860 DISP "Optimum at:"
870 For I = 1 To N
880 DISP "X(";I;")=";X(I) @ PAUSE
890 Next I
900 DISP "Function value ="; L @ PAUSE
910 DISP "Number of iterations = ";I0
920 !
930 !---------------- Declare SUBs -------------------
940 !
950 SUB MyFx(X(), N, R)
960 ! R = 100 * (X(1) ^ 2 - X(2)) ^ 2 + (1 - X(1)) ^ 2
970 R = X(1) - X(2) + 2 * X(1) ^ 2 + 2 * X(1) * X(2) + X(2) ^ 2
980 End SUB
990 !
1000 SUB MyFxEx(N, X(), D(), L, R)
1010 DIM X0(N)
1020 For I = 1 To N
1030 X0(I) = X(I) + L * D(I)
1040 Next I
1050 CALL MyFx(X0, N, R)
1060 End SUB
1070 !
1080 Sub GetGrads(X(), N, D(), D9)
1090 D9 = 0
1100 For I = 1 To N
1110 X0 = X(I)
1120 H = 0.01 * (1 + ABS(X0))
1130 X(I) = X0 + H
1140 CALL MyFx(X, N, F1)
1150 X(I) = X0 - H
1160 CALL MyFx(X, N, F2)
1170 X(I) = X0
1180 D(I) = (F1 - F2) / 2 / H
1190 D9 = D9 + D(I) ^ 2
1200 Next I
1210 D9 = Sqr(D9)
1220 End Sub
1230 !
1240 SUB LSrch(X(), N, L, D(), I9, I0, R)
1250 CALL MyFxEx(N, X, D, L, F1)
1260 REPEAT
1270 CALL MyFxEx(N, X, D, L + I9, F2)
1280 If F2 < F1 Then
1290 F1 = F2
1300 L = L + I9
1310 Else
1320 CALL MyFxEx(N, X, D, L - I9, F2)
1330 If F2 < F1 Then
1340 F1 = F2
1350 L = L - I9
1360 Else
1370 ! reduce search step size
1380 I9 = I9 / 10
1390 End If
1400 End If
1410 UNTIL I9 < I0
1420 R = 1
1430 End SUB

The above listing uses the JPCROM to employ more structured constructs for loops and decision-making: the REPEAT-UNTIL and LOOP-END LOOP loops, and the IF-THEN-ELSE-END IF enhanced version of the basic IF statement. These flow-control statements limit or eliminate the use of GOTO statements and make programs easier to read and update. Here is a version of the listing that is indented and free of line numbers. I include it here since it is easier to read than the listing above, although it will not run on either the HP-71B handheld computer or the EMU71 emulator.
! Conjugate Gradient (Fletcher-Reeves) method
!
! USES JPCROM TO SUPPORT ADVANCED LOOPS AND
! DECISION-MAKING CONSTRUCTS
!
N = 2
DIM X(N), D(N), D0(N)
! Input data
DISP "Conjugate Gradient Optimization"
INPUT "Enter function tolerance value? "; E
INPUT "Enter maximum number of iterations? "; I9
For I = 1 to N
  DISP "Enter guess for X(";I;")";
  INPUT X(I)
Next I
INPUT "Show intermediate values? (Y/N) "; A$
If UPRC$(A$[1,1]) = "Y" Then
  For I = 1 To N
    DISP "X(";I;")",X(I);" ";
  Next I
  DISP
End If
!
! calculate function value at initial point
CALL MyFx(X, N, L)
!
CALL GetGrads(X, N, D, D8)
!
L8 = 0
CALL LSrch(X, N, L8, D, 0.1, 0.000001, R0)
If R0 = 1 Then
  For I = 1 To N
    X(I) = X(I) + L8 * D(I)
  Next I
Else
  DISP "Failed linear search"
  STOP
End If
!
I0 = 1
LOOP
  I0 = I0 + 1
  If I0 > I9 Then
    DISP "Reached maximum iterations limit"
    LEAVE
  End If
  D9 = D8
  For I = 1 To N
    D0(I) = D(I) ! save old gradient
  Next I
  CALL GetGrads(X, N, D, D8)
  For I = 1 To N
    D(I) = (D8 / D9) ^ 2 * D0(I) - D(I)
  Next I
  If D8 <= E Then
    DISP "Gradient norm meets convergence criteria"
    LEAVE
  End If
  L8 = 1
  CALL LSrch(X, N, L8, D, 0.1, 0.000001, R0)
  If R0 = 1 Then
    For I = 1 To N
      X(I) = X(I) + L8 * D(I)
    Next I
    CALL MyFx(X, N, F)
    If Abs(F - L) < E Then
      DISP "Successive function values meet convergence criteria"
      LEAVE
    Else
      L = F
    End If
  Else
    DISP "Failed linear search",
    LEAVE
  End If
  If UPRC$(A$[1,1]) = "Y" Then
    For I = 1 To N
      DISP "X(";I;")",X(I);
    Next I
    DISP
  End If
END LOOP
!
DISP
DISP "**********FINAL RESULTS************"
DISP "Optimum at:"
For I = 1 To N
  DISP "X(";I;")=";X(I) @ PAUSE
Next I
DISP "Function value ="; L @ PAUSE
DISP "Number of iterations = ";I0
!
!---------------- Declare SUBs -------------------
!
SUB MyFx(X(), N, R)
  ! R = 100 * (X(1) ^ 2 - X(2)) ^ 2 + (1 - X(1)) ^ 2
  R = X(1) - X(2) + 2 * X(1) ^ 2 + 2 * X(1) * X(2) + X(2) ^ 2
End SUB
!
SUB MyFxEx(N, X(), D(), L, R)
  DIM X0(N)
  For I = 1 To N
    X0(I) = X(I) + L * D(I)
  Next I
  CALL MyFx(X0, N, R)
End SUB
!
Sub GetGrads(X(), N, D(), D9)
  D9 = 0
  For I = 1 To N
    X0 = X(I)
    H = 0.01 * (1 + ABS(X0))
    X(I) = X0 + H
    CALL MyFx(X, N, F1)
    X(I) = X0 - H
    CALL MyFx(X, N, F2)
    X(I) = X0
    D(I) = (F1 - F2) / 2 / H
    D9 = D9 + D(I) ^ 2
  Next I
  D9 = Sqr(D9)
End Sub
!
SUB LSrch(X(), N, L, D(), I9, I0, R)
  CALL MyFxEx(N, X, D, L, F1)
  REPEAT
    CALL MyFxEx(N, X, D, L + I9, F2)
    If F2 < F1 Then
      F1 = F2
      L = L + I9
    Else
      CALL MyFxEx(N, X, D, L - I9, F2)
      If F2 < F1 Then
        F1 = F2
        L = L - I9
      Else
        ! reduce search step size
        I9 = I9 / 10
      End If
    End If
  UNTIL I9 < I0
  R = 1
End SUB
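SUB GetGrads approximates each partial derivative with a central difference, using a step H = 0.01 * (1 + ABS(X(I))) that scales with the magnitude of each variable, and returns the norm of the gradient in D9. Here is a Python sketch of the same scheme (the function and helper names are mine, not from the listing):

```python
from math import sqrt

def f(x):
    # test function from the listing
    return x[0] - x[1] + 2 * x[0] ** 2 + 2 * x[0] * x[1] + x[1] ** 2

def get_grads(fx, x):
    """Central-difference gradient and its norm, mirroring SUB GetGrads."""
    g = []
    for i in range(len(x)):
        h = 0.01 * (1 + abs(x[i]))      # relative step, as in the listing
        xp = list(x); xp[i] += h        # forward point
        xm = list(x); xm[i] -= h        # backward point
        g.append((fx(xp) - fx(xm)) / (2 * h))
    return g, sqrt(sum(gi * gi for gi in g))

g, norm = get_grads(f, [0.0, 0.0])
print(g, norm)  # central differences are exact for a quadratic: gradient (1, -1) at the origin
```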
Here is a version of the first listing that DOES NOT rely on the JPCROM. This version should run on a basic HP-71B with no Math or JPCROM modules.
10 ! Conjugate Gradient (Fletcher-Reeves) method
50 !
60 N = 2
70 DIM X(N), D(N), D0(N)
80 ! Input data
90 DISP "Conjugate Gradient Optimization"
100 INPUT "Enter function tolerance value? "; E
110 INPUT "Enter maximum number of iterations? "; I9
120 For I = 1 to N
130 DISP "Enter guess for X(";I;")";
140 INPUT X(I)
150 Next I
160 INPUT "Show intermediate values? (Y/N) "; A$
170 If UPRC$(A$[1,1]) <> "Y" Then 220
180 For I = 1 To N
190 DISP "X(";I;")",X(I);" ";
200 Next I
210 DISP
220 !End If
230 !
240 ! calculate function value at initial point
250 CALL MyFx(X, N, L)
260 !
270 CALL GetGrads(X, N, D, D8)
280 !
290 L8 = 0
300 CALL LSrch(X, N, L8, D, 0.1, 0.000001, R0)
310 If R0 = 0 Then 350
320 For I = 1 To N
330 X(I) = X(I) + L8 * D(I)
340 Next I
345 GOTO 380
350 !Else
360 DISP "Failed linear search"
370 STOP
380 !End If
390 !
400 I0 = 1
410 !LOOP
420 I0 = I0 + 1
430 If I0 <= I9 Then 460
440 DISP "Reached maximum iterations limit"
450 GOTO 840
460 !End If
470 D9 = D8
480 For I = 1 To N
490 D0(I) = D(I) ! save old gradient
500 Next I
510 CALL GetGrads(X, N, D, D8)
520 For I = 1 To N
530 D(I) = (D8 / D9) ^ 2 * D0(I) - D(I)
540 Next I
550 If D8 > E Then 580
560 DISP "Gradient norm meets convergence criteria"
570 GOTO 840
580 !End If
590 L8 = 1
600 CALL LSrch(X, N, L8, D, 0.1, 0.000001, R0)
610 If R0 = 0 Then 720
620 For I = 1 To N
630 X(I) = X(I) + L8 * D(I)
640 Next I
650 CALL MyFx(X, N, F)
660 If Abs(F - L) >= E Then 690
670 DISP "Successive function values meet convergence criteria"
680 GOTO 840
690 !Else
700 L = F
710 !End If
715 GOTO 750
720 !Else
730 DISP "Failed linear search",
740 GOTO 840
750 !End If
760 If UPRC$(A$[1,1]) <> "Y" Then 810
770 For I = 1 To N
780 DISP "X(";I;")";X(I)," ";
790 Next I
800 DISP
810 !End If
820 GOTO 410 ! END LOOP
830 !
840 DISP
850 DISP "**********FINAL RESULTS************"
860 DISP "Optimum at:"
870 For I = 1 To N
880 DISP "X(";I;")=";X(I) @ PAUSE
890 Next I
900 DISP "Function value ="; L @ PAUSE
910 DISP "Number of iterations = ";I0
920 !
930 !---------------- Declare SUBs -------------------
940 !
950 SUB MyFx(X(), N, R)
960 ! R = 100 * (X(1) ^ 2 - X(2)) ^ 2 + (1 - X(1)) ^ 2
970 R = X(1) - X(2) + 2 * X(1) ^ 2 + 2 * X(1) * X(2) + X(2) ^ 2
980 End SUB
990 !
1000 SUB MyFxEx(N, X(), D(), L, R)
1010 DIM X0(N)
1020 For I = 1 To N
1030 X0(I) = X(I) + L * D(I)
1040 Next I
1050 CALL MyFx(X0, N, R)
1060 End SUB
1070 !
1080 Sub GetGrads(X(), N, D(), D9)
1090 D9 = 0
1100 For I = 1 To N
1110 X0 = X(I)
1120 H = 0.01 * (1 + ABS(X0))
1130 X(I) = X0 + H
1140 CALL MyFx(X, N, F1)
1150 X(I) = X0 - H
1160 CALL MyFx(X, N, F2)
1170 X(I) = X0
1180 D(I) = (F1 - F2) / 2 / H
1190 D9 = D9 + D(I) ^ 2
1200 Next I
1210 D9 = Sqr(D9)
1220 End Sub
1230 !
1240 SUB LSrch(X(), N, L, D(), I9, I0, R)
1250 CALL MyFxEx(N, X, D, L, F1)
1260 !REPEAT
1270 CALL MyFxEx(N, X, D, L + I9, F2)
1280 If F2 >= F1 Then 1310
1290 F1 = F2
1300 L = L + I9
1305 GOTO 1400
1310 !Else
1320 CALL MyFxEx(N, X, D, L - I9, F2)
1330 If F2 >= F1 Then 1360
1340 F1 = F2
1350 L = L - I9
1355 GOTO 1390
1360 !Else
1370 ! reduce search step size
1380 I9 = I9 / 10
1390 !End If
1400 !End If
1410 IF I9 >= I0 THEN 1260
1420 R = 1
1430 End SUB
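As a cross-check of the listings, here is a compact Python sketch of the same algorithm: central-difference gradients (SUB GetGrads), the derivative-free step-reduction line search (SUB LSrch), the Fletcher-Reeves direction update of line 530, and the two convergence tests. All names are mine; this is an illustration of the method, not a translation guaranteed to match the HP-71B output digit for digit.

```python
from math import sqrt

def f(x):
    return x[0] - x[1] + 2 * x[0] ** 2 + 2 * x[0] * x[1] + x[1] ** 2

def grad(fx, x):
    # central differences with a relative step, as in SUB GetGrads
    g = []
    for i in range(len(x)):
        h = 0.01 * (1 + abs(x[i]))
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((fx(xp) - fx(xm)) / (2 * h))
    return g

def lsrch(fx, x, lam, d, step=0.1, min_step=1e-6):
    # bidirectional step search with step reduction, as in SUB LSrch
    fat = lambda l: fx([xi + l * di for xi, di in zip(x, d)])
    f1 = fat(lam)
    while step >= min_step:
        f2 = fat(lam + step)
        if f2 < f1:
            f1, lam = f2, lam + step
        else:
            f2 = fat(lam - step)
            if f2 < f1:
                f1, lam = f2, lam - step
            else:
                step /= 10          # reduce the search step size
    return lam

def fletcher_reeves(fx, x, tol=1e-7, max_iter=100):
    fbest = fx(x)
    d = grad(fx, x)                          # initial direction: the raw
    gnorm = sqrt(sum(gi * gi for gi in d))   # gradient, as in the listing
    lam = lsrch(fx, x, 0.0, d)
    x = [xi + lam * di for xi, di in zip(x, d)]
    it = 1
    while True:
        it += 1
        if it > max_iter:
            break
        gnorm_old, d_old = gnorm, d
        g = grad(fx, x)
        gnorm = sqrt(sum(gi * gi for gi in g))
        # Fletcher-Reeves update (line 530): d = beta * d_old - g
        d = [(gnorm / gnorm_old) ** 2 * d0 - gi for d0, gi in zip(d_old, g)]
        if gnorm <= tol:
            break                            # gradient-norm test
        lam = lsrch(fx, x, 1.0, d)
        x = [xi + lam * di for xi, di in zip(x, d)]
        fnew = fx(x)
        if abs(fnew - fbest) < tol:
            break                            # successive-function-values test
        fbest = fnew
    return x, fbest, it

xmin, fmin, iters = fletcher_reeves(f, [0.0, 0.0])
print(xmin, fmin, iters)  # converges toward (-1, 1.5) with f = -1.25
```

Because the line search probes both positive and negative steps, the sign convention of the stored direction does not matter, which is why the listing can start from the raw gradient instead of its negative.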
Copyright (c) Namir Shammas. All rights reserved.