Patch Name: PHSS_29646

Patch Description: s700_800 11.00 OV ITO6.0X Solaris Agent Patch A.06.15.1

Creation Date: 03/08/27
Post Date:     03/08/29

Hardware Platforms - OS Releases:
  s700: 11.00
  s800: 11.00

Products: OpenView IT/Operations 6.0

Filesets: OVOPC-CLT.OVOPC-SOL-CLT,fr=A.06.00,fa=HP-UX_B.11.00_32/64,v=HP

Automatic Reboot?: No

Status: General Release

Critical: Yes
  PHSS_29646: ABORT

Category Tags: defect_repair general_release critical halts_system

Path Name: /hp-ux_patches/s700_800/11.X/PHSS_29646

Symptoms:
 PHSS_29646:
 - SR: H555010800
   The DCE daemon stops or hangs on the system when it receives invalid data.

 PHSS_28863:
 - SR: H555007666
   mailq_l.sh does not count mails correctly if sendmail version 10 or above is used.
 - SR: H555008275
   The Message Agent can hang for no apparent reason and stop sending all messages to the Management Server, regardless of its state.
 - SR: H555008529
   If a process dies immediately after being started by the Control Agent, OpC30-1094 messages may start appearing in the error logfile.
 - SR: H555008553
   itochecker could collect information only from the management server; there was no way to collect information from the managed nodes. itochecker did not return the correct values when checking the kernel parameters, and it gathered no information about opcerror.
 - SR: H555008582
   The LANG value in the start-up script is not set correctly for the Tru64, AIX, Solaris, and Linux agents.
 - SR: H555008602
   If OPC_RPC_ONLY is set to TRUE in opcinfo, the message agent core dumps after a while.
 - SR: H555008631
   The customer receives a lot of OpC20-61 and OpC20-63 messages in the error logfile when using NCS agents.
 - SR: H555010228
   opcctla core dumps after configuring the SunMC 3.0 integration.
 - SR: 8606180583
   When the VPO agent was started manually from an MC/SG shared volume, the agent was killed upon package stop, because the agent used this volume as its current directory. Now the agent always starts in /tmp. This also has the side effect that any core file for the agent is written into /tmp.
 - SR: 8606187183
   After a deploy/undeploy of opcmsg policies/templates, the suppressing times are lost. Messages that should be suppressed after a deploy/undeploy of policies/templates are shown.
 - SR: 8606232431
   VPO tries to resolve node names that contain only blanks because of a typo in a template definition or variable assignment. This leads to a lot of unnecessary DNS traffic.
 - SR: 8606262299
   The logfile encapsulator reports that the file to be executed for preprocessing of a logfile template failed. This error occurs randomly and only from time to time. You will get an error message similar to the following:
   Command 'opcfwtmp /tmp/wtmp.stat /var/adm/wtmp /tmp/wtmp.out' configured in source 'Logins (10.x/11.x HP-UX)' returns 1. Ignoring this logfile. (OpC30-107)
 - SR: 8606282247
   The Logfile Encapsulator does not perform variable replacement for all Message Defaults fields.
 - SR: B555008674
   The opcagt and opcragt commands have a new option '-version'. It was documented in neither man page and was not part of the usage strings of opcagt and opcragt.
 - SR: B555010955
   Even if you used opcswitchuser.sh to specify a non-root user that should run the ITO agent, the agent is still started as user root after a system reboot.
 - SR: B555014215
   The port on which opctrapi listens for incoming traps should be configurable.
 - SR: B555014245
   Traps larger than 5 KB are not handled properly by opctrapi if local trap interception is used.
 - SR: B555014574
   opcagt -start/-stop/-status does not work correctly if the currently running agent cannot be reached over RPC.
 - SR: B555014591
   When OPC_INT_MSG_FLT is set to TRUE, the filtered message is received corrupted on the server in a Japanese environment.
 - SR: B555014596
   Includes fixes from the ECS runtime engine delivered with PHSS_26909 and equivalent patches.
 - SR: B555014715
   The Control Agent slowly grows in memory usage.
 - SR: B555014851
   opcmsga sends the same message operation (e.g. an acknowledge request created by opcmack(1)) again and again if the related message is not in the cache and one of the target managers cannot be reached.
 - SR: B555014942
   The opcle process loops if a logfile is removed while it is being read.
 - SR: B555015331
   The monitor agent, opcmona, may report wrong results of executed monitor scripts or programs when many 'advanced monitors' such as OVPERF are used. In some of these cases opcmona might even abort.
 - SR: B555015626
   opcif_read() does not return data if the signal pipe is empty but there is still more data in the queue file. This can happen, for example, if the maximum pipe size of 8192 bytes was reached and therefore no more signal bytes could be written into the signal pipe.
 - SR: B555015758
   opcmsgi aborts if one of the set attributes has an unmatched '<'.
 - SR: R555008715
   If a message key contains characters like [] or <>, message correlation does not work.

 PHSS_27298:
 - SR: H555003664
   Option "-" of dmesg is not supported anymore.
 - SR: H555006719
   If the agent is running as a non-root user, the agent has to be restarted whenever the management server processes are restarted; otherwise messages are buffered.
 - SR: H555006934
   After agent installation, internal messages are sent with the wrong codeset.
 - SR: H555008352
   ITO agent installation check on Solaris 2.9.
 - SR: 8606213476
   The distribution to nodes may hang or fail. This is more likely to happen while distributing to Windows NT/2000 nodes than to UNIX nodes. On Windows NT/Windows 2000 nodes the control agent may produce a Dr. Watson error.
 - SR: 8606217814
   If the customer calls the agent init script manually from a non-POSIX shell, it fails; therefore a shell definition needs to be added.
 - SR: 8606222554
   Certain policies in VPW do not work as expected, for example:
     VP_WIN-WINS-FwdAllInformation
     VP_WIN-WINS-FwdAllWarnError
     VP_WIN-DHCPCl_FwdAllInfo
     VP_WIN-DHCPCl_FwdAllWarnError
   This problem can also occur for VPO during condition matching. Matching the application and object attributes is now case sensitive. For example, a message with application "TEST" is matched but application "tEST" is unmatched.
 - SR: 8606227840
   Variables in the template default message key are not resolved for unmatched messages.
 - SR: 8606233602
   If a pattern like '<*.prefix>ERR<*.suffix>' is used, the prefix variable is assigned wrong text when it should be empty.
 - SR: 8606242614
   Messages are incorrectly suppressed by the logfile encapsulator if "suppress identical output messages" is specified and the messages differ only in the values of <$LOGFILE> and/or <$LOGPATH>.
 - SR: 8606244523
   When the syntax <`script`> is used in the logfile template and the script returns the same logfile name twice, opcle aborts.
 - SR: B555007980
   Local automatic actions are started immediately, even though the agent MSI is enabled in divert mode and the Immediate Local Automatic Action box is not checked.
 - SR: B555008220
   The <$MSG_TIME_CREATED> variable is not substituted in the message template.
 - SR: B555008838
   The event correlation engine creates a "Time cannot go backwards" error if the system is very busy.
 - SR: B555009745
   The template default of the object field of a monitor template is not used.
 - SR: B555010620
   Some messages are missing in the Japanese message catalogue. You get a "Cannot generate message" error.
 - SR: B555010966
   A message key relation containing <*> does not always match message keys correctly. This results in messages not being acknowledged when they should be.
 - SR: B555011184
   opcagt fails to start opcctla if it is started as ./opcagt and /opt/OV/bin/OpC is not in the search PATH.
 - SR: B555011594
   The original message text of a logfile encapsulator message is wrong if <$LOGPATH> or <$LOGFILE> is used.
 - SR: B555011638
   The pattern matching cannot match the newline character(s) of multiline messages.
 - SR: B555011979
   The pattern matching hangs if only single-byte Japanese HANKAKU KANA characters are used.
 - SR: B555011990
   The ECS event log (ecevilg) has an invalid time difference to the next message, which can cause the ECS simulator to hang, or appear to hang, when loading an event log file with such values.
 - SR: B555012210
   An ECS circuit using reset on an unless node causes opceca to abort.
 - SR: B555012929
   If you run opcdista from the command line, you do not get any useful messages, only the internal status letters. For supportability, it would be better to have explicit status and error reporting.
 - SR: B555013371
   Sometimes the new scheduled action template configuration is not loaded after a distribution. Instead, the old scheduled actions are still started.
 - SR: B555013435
   The message agent opcmsga hangs unpredictably. This is more likely to happen on systems with very high ICMP traffic.
 - SR: B555013495
   In Japanese environments, programs using the agent APIs can fail with errors about invalid or incompatible codesets.
 - SR: B555013620
   Support for pmd's "u" option is needed in opctrapi: use the UDP packet's address as the source of the trap.
 - SR: B555013719
   The message agent does not stop message buffering when the management server is available again after a network outage, a fixed DNS problem, or similar. This can happen when the agent restarts or the machine reboots while the network problem exists.
 - SR: B555013794
   The current directory should be removed from the agent's environment PATH. That way, only files that are specified with a full path, or whose directory is explicitly listed in PATH, can be executed.
 - SR: B555013891
   In MoM environments, opcmsga does not return action responses to SECONDARY managers if their name is not resolvable.
 - SR: B555014093
   opcmona may crash (UNIX) or not process all SCHEDULE templates (Windows) when SCHEDULE templates are used.
 - SR: B555014132
   During a distribution the agent may report an error like:
   ITO responsible manager configuration. (OpC30-1203) Cannot open file \usr\OV\tmp\OpC\cfgchg. System Error Number: 13 (d) - The data is invalid.
   (OpC20-63)
 - SR: B553000162
   After opcagt -stop, opcagt -status reports that the control agent is not running although it is, and sometimes you get the following error in the message browser:
   ouput of kill -0 differs from internal pids-table for index (OpC30-1094)

 PHSS_24641:
 - SR: H555005793
   After A.06.06, opcctla might core dump during distribution.
 - SR: H555005720
   The vcs_monitor.sh script is missing from the NCS comm package.
 - SR: H555005837
   Japanese catalogs are missing in the A.06.06 patch.
 - SR: H555004515
   A subagent cannot be stopped on Solaris 8.
 - SR: B555010879
   opctrapi aborts during template distribution if conditions with the 'Suppress Identical Output Messages' feature are used.
 - SR: B555010899
   opcdista requests distribution data from a wrong manager if there is a secondary manager with the same short hostname as the appropriate primary manager.
 - SR: B555010948
   Nested alternatives were not handled correctly by the pattern matching algorithm; e.g. the pattern '[a|b]c|d' was handled like '[a|b|d]c'.
 - SR: B555010980
   Traps without an SNMP variable are not matched because the server patch adds an extra attribute to the template.

 PHSS_24126:
 - SR: 8606180891
   The template default for the service name is not used.
 - SR: 8606181988
   The event interceptor does not forward on "forward unmatched" if a "suppress unmatched" condition is used in a second template.
 - SR: B555010341
   The agent sometimes does not start automatically after reboot, while a manual start works fine.
 - SR: 8606146160
   Veritas Cluster was not supported on VPO for Sun Solaris.

 PHSS_23825:
 - The VPO A.06.03 patches for HP-UX and Solaris do not work as expected in firewall environments: while server port restrictions are still regarded, client-side port restrictions are ignored.
 - The event correlation process opcecm might crash after processing several annotation nodes.
 - HPlwdce kill (shutdown) scripts are missing.

 PHSS_22886:
 - Changes were required for the security add-on product VantagePoint Advanced Security.
 - The IT/O Agent API call opcagtmsg_send() might leak memory if opcmsgi is not running.
 - opctrapi might abort with OpC30-104.
 - disk_mon.sh returns invalid values if the bdf command returns more than one line of output for a filesystem (e.g. if the filesystem name exceeds its column width).
 - The ITO Agent might hang during ITA synchronisation when transferring larger files (>0.5 MB) from several nodes to the ITO server at the same time.
 - swap_util monitoring does not work on Solaris 8 in a Japanese environment.
 - Problems with the allowed port range.
 - In case of an error, the ITO Agent might hang during startup without terminating.

 PHSS_22256:
 - The agent needs better handling for firewall environments.
 - When executing large numbers of automatic actions, some of them were staying in the 'running' state.
 - opctrapi aborts after getting traps with an unresolvable IP address.
 - The handling of '\' was different in the pattern definition and the "matching pattern".
 - opcmsga reported "Message Receiver service not registered" even if the connection to the server was OK.
 - Running a VBScript tool on a UNIX node gave a strange error message.
 - If buffer file size limitation is enabled, the agent may discard low-severity messages even if there is still space in the buffer file.
 - The swap_util template did not work on Solaris 8 in a Japanese environment.

Defect Description:
 PHSS_29646:
 - SR: H555010800
   The DCE daemon fails when it receives invalid data. The code has been fixed to ignore such packets.

 PHSS_28863:
 - SR: H555007666
   The output of sendmail 10.75 has changed,
   so the regular expression needed to be modified.
 - SR: H555008275
   The signal handler for SIGIO was installed before the socket on which ICMP replies are received was set to non-blocking mode. An unsolicited SIGIO would trigger the signal handler, which would wait indefinitely on the socket for data that would never arrive. Since the NCS agent is single-threaded, all communication would stop. The fix is to set non-blocking mode before installing the signal handler, so that the handler no longer waits forever.
 - SR: H555008529
   This is a timing issue where internal structures are not updated by the signal handler in time for proper values to be written to the PIDS file. An additional check for process presence has been implemented before the PIDS file is written.
 - SR: H555008553
   Resolution: itochecker_agt and its configuration file itochecker_agt.conf were introduced. Checking the kernel parameters now returns the correct values. An additional option (8) was added, which gets the opcerror file on the management server.
 - SR: H555008582
   The agent used the system value for the LANG setting in the start-up script, which was a problem when this setting was not the same as the LANG setting in the database. The installation now always checks the node settings in the database.
 - SR: H555008602
   When OPC_RPC_ONLY is used, ICMP handling is not initialized, but the message agent still calls opc_pb_ping_reset() after a successful server checkalive cycle. This causes an invalid (NULL) pointer to be dereferenced and leads to a core dump. opc_pb_ping_reset() now checks whether ICMP handling has been initialized and, if not, returns immediately.
 - SR: H555008631
   The NCS agent's open() and stat() calls did not handle EINTR, so a check/loop was implemented to handle it.
 - SR: H555010228
   The buffer for the subagent's name was of fixed size; longer names caused a core dump.
 - SR: 8606187183
   The opcmsg interceptor restarts after a deploy/undeploy of policies/templates. During this process, all policy/template information is cleaned and read again from a temporary file. Because the suppressing times are not stored in this temporary file, these times were lost. Now the suppressing times are taken over into the new data.
 - SR: 8606232431
   VPO now ignores node names that contain only white space characters, without contacting the name service.
 - SR: 8606282247
   Variable replacement is now performed for all Message Defaults fields.
 - SR: B555008674
   The man pages for opcagt and opcragt now document the new option '-version'. The message catalog was updated to show the '-version' option in the usage strings of the opcagt and opcragt commands.
 - SR: B555010955
   The non-root user is added to the startup configuration file but was not used.
 - SR: B555014215
   Using the new opcinfo variable SNMP_TRAP_PORT, opctrapi can now be configured to listen on a port other than 162. This is only effective if traps are not received through the NNM pmd.
 - SR: B555014574
   With this change, opcagt is now able to deal with a running opcctla that is not reachable via RPC: opcagt -status displays a warning if the currently running opcctla is not reachable over RPC, and then displays the status according to the pids file. opcagt -stop also kills the unresponsive opcctla and tries to start a new one. If opcctla is not reachable over RPC, opcagt -start kills all running agent processes and then starts a new opcctla, which starts the agent processes. Of course, the agent will not be able to start if RPC is still unavailable at that time.
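   As an illustration, a recovery sequence on a managed node might now look like this (the path assumes the standard agent installation directory; the actual output depends on the state of the old opcctla):

     /opt/OV/bin/OpC/opcagt -status  # warns if opcctla is unreachable via RPC,
                                     # then reports status from the pids file
     /opt/OV/bin/OpC/opcagt -stop    # also kills an unresponsive opcctla
     /opt/OV/bin/OpC/opcagt -start   # kills leftover agent processes and
                                     # starts a new opcctla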
 - SR: B555014591
   The defect was caused by a double conversion from the server code set to the internal code set: once on the agent's side when the internal message was sent to opcmsga, and once by opcmsgi when it forwarded the message again. Now the message is converted back from the internal code set to the server code set in opcmsga before it is sent to the opcmsgi queue. The management server receives the message through opcmsga in the internal code set and converts it into the server code set. The conversion is made only if the internal code set differs from the server code set.
 - SR: B555014851
   opcmsga maintains an internal cache to find the target managers per message ID. The cache expires after 1 hour (this can be changed with the opcsvinfo variable OPC_STORE_TIME_FOR_MGR_INFO); after that, a flaw in the algorithm caused a message operation on a non-cached message to be sent again and again until the last target manager in an internal list could be reached.
 - SR: B555015331
   opcmona holds a central table for all subprocess-related information. Advanced monitors are executed in separate threads and could access this table in parallel, overwriting each other's data. The table accesses are now serialized by a mutex.
 - SR: R555008715
   Now all characters that have a special meaning for the pattern matching (^ | $ [ ] < > \) are properly escaped.

 PHSS_27298:
 - SR: H555006719
   When communication with a message receiver fails, the message agent starts buffering messages. It periodically checks whether a server is alive by sending it ICMP packets; if the server cannot be reached with ICMP packets, no RPC communication is attempted. Sending ICMP packets is not possible when the agent is running as a non-root user, so the sending function cannot actually send anything. Therefore no replies are ever received, and the message agent buffers messages forever. To fix this, when the agent is running as a non-root user, the internal state of the message agent is now updated after an ICMP send has been attempted.
 - SR: H555006934
   The start-up script now checks the nodeinfo file and sets the appropriate language (the same as in nodeinfo) before starting the agent.
 - SR: 8606213476
   When the agent receives several RPC calls, such as "Start Distribution", "Execute Action", or "Set Primary Manager", in parallel, the calls may conflict within the control agent, which causes the control agent to produce a Dr. Watson window. This conflict can also occur on UNIX, but there the control agent does not die; instead, the RPC request may fail. With this version, the RPC calls that could cause conflicts are serialized.
 - SR: 8606217814
   To avoid problems with different shells, an interpreting shell has been defined explicitly.
 - SR: 8606222554
   The condition tests for the message attributes application, object, and message group are always done case-sensitively; therefore a message with the application "TEST" matches but "tEST" does not. This patch introduces an opcinfo flag that allows switching between case-sensitive and case-insensitive checks:
     flag   : OPC_COND_FIELD_ICASE
     type   : boolean
     default: FALSE
   By setting this flag to TRUE, the policies mentioned above will work.
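   For illustration, the flag can be enabled by appending it to the agent's opcinfo file and restarting the agent. The opcinfo path shown here is an assumption based on a standard Solaris managed node layout and may differ on other platforms:

     # append the flag to opcinfo on the managed node
     echo "OPC_COND_FIELD_ICASE TRUE" >> /var/opt/OV/bin/OpC/install/opcinfo
     # restart the agent so the new setting takes effect
     /opt/OV/bin/OpC/opcagt -stop
     /opt/OV/bin/OpC/opcagt -start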
 - SR: 8606242614
   The variables <$LOGFILE> and <$LOGPATH> were replaced only after the suppression rules had been evaluated. Therefore the comparison did not use the actual logfile name or path, but compared the literal string "<$LOGFILE>" or "<$LOGPATH>".
 - SR: 8606244523
   opcle aborts when a <`script`> returns a logfile name twice, because the same file was referenced and handled twice. To fix this, opcle checks whether the same logfile has already been returned, so each logfile is added to the internal list exactly once.
 - SR: B555010966
   The processing of the key relation was wrong in the logfile encapsulator: all unresolved entries followed by a resolved entry were removed, while other unresolved entries were kept as is.
 - SR: B555011184
   The working directory for the ITO agent was changed from /opt/OV/bin/OpC to /tmp to avoid problems when the agent is running in an MC/SG environment.
 - SR: B555011638
   The pattern matching could not match the newline character(s) of multiline messages. The following changes have been made to allow this: It is now possible to use ^M (\r) as a field separator, and a new pattern <n/> was introduced to match line breaks (UNIX style \n or NT style \r\n). <n/> matches exactly n line breaks; for example, <1/> matches exactly one line break. This works only for sources that can already create multiline messages (for example opcmsg or the NT event log); it does not allow multiline logfile encapsulation. The change requires changes on both the management server and the agent, so a patch for the management server and a patch for the agent are required to use the new functionality.
 - SR: B555012210
   Linked with a new ECS runtime library that contains a fix for this problem.
 - SR: B555012929
   opcdista communicates with the opcctla process via stdin/stdout, so if you run it from the command line you only see the status letters without knowing what they mean. The new '-v' option prints more output, e.g.:
     $ ./opcdista -v
     0 - No distribution data available.
 - SR: B555013435
   One thread tried to read from a socket while another thread closed it. This could happen due to missing locking of global data. This data is now guarded by a mutex.
 - SR: B555013495
   When tracing was added to the API functions, a necessary NLS initialisation was not done. This problem was introduced only by the A.06.10 patches for HP-UX.
 - SR: B555013620
   NNM 6.2 introduced an event option "u" to pmd. This option specifies that the IP address in an SNMPv1 trap's UDP header is preferred over the contents of the SNMPv1 trap PDU's agent_addr field. A new opcinfo variable OPC_USE_UDP_AS_TRAP_SOURCE was added for opctrapi. If it is set to TRUE, opctrapi uses the UDP address instead of the agent_addr.
 - SR: B555013719
   The message agent remained in buffering mode even when the management server was available again. The reason is that the agent could not resolve the management server name to an IP address at startup, and the agent did not try again during runtime. This has been fixed by checking for a resolvable name every time a message is to be buffered, until the name can be resolved; after that, the normal checkalive mechanism, which handles buffered messages, takes over.
 - SR: B555013891
   Even if the IP address of the management server was specified in the mgrconf file, it was used only for the primary manager. This behaviour was changed to give the mgrconf file precedence over name resolution.
 - SR: B555014093
   opcmona may crash (UNIX) or not process all SCHEDULE templates (Windows) when SCHEDULE templates are used. This can occur when there are only spaces in one of the schedule fields (Minute, Hour, Day of the Month, Month, Year, Day of the Week). You can verify this by going to the conf/OpC directory on the node and running opcdcode on the monitor template, as sketched below; the problem can occur when there are entries like WEEKDAY " ". Now the monitor agent treats a sequence of spaces like an empty string, that is, as a wildcard, and uses all valid values in the possible range; for WEEKDAY this is 0-6.
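   A quick way to perform the verification described above on a managed node (the paths are assumptions based on a standard Solaris agent layout; opcdcode decodes the encrypted template files):

     cd /var/opt/OV/conf/OpC                  # template directory on the node
     /opt/OV/bin/OpC/utils/opcdcode monitor | grep WEEKDAY
     # entries such as:  WEEKDAY " "
     # indicate a schedule field that contains only spaces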
 - SR: B555014132
   During a distribution, the agent may report error OpC30-1203/OpC20-63 when trying to access the cfgchg file. The cause of this problem is that several processes try to get exclusive access to this file at the same time. The problem has been fixed by retrying up to 10 times, with a delay of one second, if the error occurs.

 For SRs not listed in this section, please see the list of symptoms.

 PHSS_24641:
   Check the list of symptoms.
   Resolution: check the list of symptoms.

 PHSS_24126:
   NSMbb40742
 - SR: B555010341
   When the process ID of 'opcctla -start' is the same as that of the opcctla running before the shutdown, the internal logic concluded that the agent was already running and did not start the subprocesses.
   For all other defects not listed in this section, please see the list of symptoms.
   Resolution: check the list of symptoms.

 PHSS_23825:
   Check the list of symptoms.
   Resolution: check the list of symptoms.

 PHSS_22886:
   Check the list of symptoms.
   Resolution: check the list of symptoms.

 PHSS_22256:
   Check the list of symptoms.
   Resolution: check the list of symptoms.

Enhancement: No

SR: R555008715 H555010800 H555010228 H555008631 H555008602 H555008582
    H555008553 H555008529 H555008352 H555008275 H555007666 H555006934
    H555006719 H555005837 H555005793 H555005720 H555005007 H555004515
    H555003664 H555003507 H555003505 H555003466 H555003277 H555002616
    B555015758 B555015626 B555015331 B555014942 B555014851 B555014715
    B555014596 B555014591 B555014574 B555014245 B555014215 B555014132
    B555014093 B555013891 B555013794 B555013719 B555013620 B555013495
    B555013435 B555013371 B555012929 B555012210 B555011990 B555011979
    B555011638 B555011594 B555011505 B555011184 B555010980 B555010966
    B555010955 B555010948 B555010899 B555010879 B555010620 B555010341
    B555010079 B555009745 B555009212 B555009155 B555009152 B555008838
    B555008674 B555008314 B555008220 B555007980 B555007752 B555007709
    B555007602 B555007426 B555006890 B555006267 B553000162 8606282247
    8606262299 8606244523 8606242614 8606233602 8606232431 8606227840
    8606222554 8606217814 8606213476 8606203798 8606187183 8606181988
    8606180891 8606180583 8606146160 8606141434 8606137088

Patch Files:
  OVOPC-CLT.OVOPC-SOL-CLT,fr=A.06.00,fa=HP-UX_B.11.00_32/64,v=HP:
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/AgentPlatform
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/dist_del.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/mailq_pr.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/st_inetd.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/st_mail.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/st_syslogd.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/E10000Log.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/opc_sec_v.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/opcdf.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/opclpst.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/opcps.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/ssp_config.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcnsl
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcrclchk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcrdschk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcrndchk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcroschk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcrverchk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcrinst
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/ana_disk.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/cpu_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/disk_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/dist_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/last_logs.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/mailq_l.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/mondbfile.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/opcfwtmp.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/proc_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/sh_procs.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/ssp_chk.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/swap_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/vcs_monitor.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/vp_chk.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/opc_pkg.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/actions/dist_del.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/actions/mailq_pr.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/actions/st_inetd.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/actions/st_mail.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/actions/st_syslogd.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/cmds/E10000Log.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/cmds/opcdf.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/cmds/opclpst.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/cmds/opcps.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/cmds/ssp_config.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcnsl
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcrclchk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcrdschk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcrinst
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcrndchk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcroschk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcrverchk
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/ana_disk.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/cpu_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/disk_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/dist_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/last_logs.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/mailq_l.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/mondbfile.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/opcfwtmp.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/proc_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/sh_procs.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/ssp_chk.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/swap_mon.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/vcs_monitor.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/vp_chk.sh.Z
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/opc_pkg.Z
  /var/opt/OV/share/tmp/OpC_appl/new_syslog/C/TEMPLATES/LOGFILE/logfile.dat
  /var/opt/OV/share/tmp/OpC_appl/new_syslog/C/new_sysl.idx
  /var/opt/OV/share/tmp/OpC_appl/new_syslog/ja_JP.SJIS/TEMPLATES/LOGFILE/logfile.dat
  /var/opt/OV/share/tmp/OpC_appl/new_syslog/ja_JP.SJIS/new_sysl.idx
  /opt/OV/OpC/examples/progs/Makef.solaris
  /opt/OV/OpC/examples/progs/Makef.solarisdce

what(1) Output:
  OVOPC-CLT.OVOPC-SOL-CLT,fr=A.06.00,fa=HP-UX_B.11.00_32/64,v=HP:
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/AgentPlatform: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/dist_del.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/mailq_pr.sh.Z: None
/var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/actions/st_inetd.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/actions/st_mail.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/actions/ st_syslogd.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/cmds/E10000Log.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/cmds/opc_sec_v.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/cmds/opcdf.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/cmds/opclpst.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/cmds/opcps.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/cmds/ssp_config.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/install/opcnsl: pow.c 1.32 96/08/21 SMI _TBL_exp2.c 1.8 93/09/07 SMI _TBL_log2.c 1.7 93/09/07 SMI matherr.c 1.10 93/09/07 SMI HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) __libx_errno.c 1.11 96/05/08 SMI stdio.h 1.39 95/12/04 SMI feature_tests.h 1.7 94/12/06 SMI va_list.h 1.6 96/01/26 SMI types.h 1.38 95/11/14 SMI isa_defs.h 1.7 94/10/26 SMI machtypes.h 1.9 94/11/05 SMI select.h 1.10 92/07/14 SMI time.h 2.47 95/08/24 SMI time.h 1.23 95/08/28 SMI siginfo.h 1.36 95/08/24 SMI machsig.h 1.10 94/11/05 SMI socket.h 1.15 95/02/24 SMI netconfig.h 1.13 95/02/24 SMI sockio.h 1.11 93/10/26 SMI ioccom.h 1.10 92/07/14 SMI in.h 1.4 93/07/06 SMI stream.h 1.56 94/09/28 SMI vnode.h 1.54 98/10/09 SMI t_lock.h 1.42 94/11/02 SMI machlock.h 1.14 94/10/20 SMI dki_lkinfo.h 1.8 93/05/03 SMI dl.h 1.13 93/08/18 SMI sleepq.h 1.17 94/07/29 SMI turnstile.h 1.27 94/10/27 SMI param.h 1.34 95/11/05 SMI unistd.h 1.24 95/08/24 SMI pirec.h 1.11 93/12/20 SMI mutex.h 1.14 94/07/29 SMI rwlock.h 1.3 94/07/29 SMI semaphore.h 1.4 94/07/29 SMI condvar.h 1.6 94/07/29 SMI cred.h 1.18 94/12/04 SMI uio.h 1.21 94/04/22 SMI seg_enum.h 1.1 93/04/03 SMI poll.h 1.19 94/08/31 SMI strmdep.h 1.8 92/07/14 SMI byteorder.h 1.9 94/01/04 SMI netdb.h 1.17 97/12/23 SMI ctype.h 1.19 95/01/28 SMI feature_tests.h 1.7 94/12/06 SMI limits.h 1.29 96/01/11 SMI isa_defs.h 1.7 94/10/26 SMI errno.h 1.13 95/09/10 SMI errno.h 1.15 95/01/22 SMI types.h 1.38 95/11/14 SMI feature_tests.h 1.7 94/12/06 SMI isa_defs.h 1.7 94/10/26 SMI machtypes.h 1.9 94/11/05 SMI select.h 1.10 92/07/14 SMI time.h 2.47 95/08/24 SMI time.h 1.23 95/08/28 SMI siginfo.h 1.36 95/08/24 SMI machsig.h 1.10 94/11/05 SMI types.h 1.38 95/11/14 SMI feature_tests.h 1.7 94/12/06 SMI isa_defs.h 1.7 94/10/26 SMI machtypes.h 1.9 94/11/05 SMI select.h 1.10 92/07/14 SMI time.h 2.47 95/08/24 SMI time.h 1.23 95/08/28 SMI siginfo.h 1.36 95/08/24 SMI machsig.h 1.10 94/11/05 SMI unistd.h 1.33 95/08/28 SMI unistd.h 1.24 95/08/24 SMI utsname.h 1.24 95/07/14 SMI errno.h 1.13 95/09/10 SMI errno.h 1.15 95/01/22 SMI in.h 1.4 93/07/06 SMI stream.h 1.56 94/09/28 SMI vnode.h 1.54 98/10/09 SMI t_lock.h 1.42 94/11/02 SMI machlock.h 1.14 94/10/20 SMI dki_lkinfo.h 1.8 93/05/03 SMI dl.h 1.13 93/08/18 SMI sleepq.h 1.17 94/07/29 SMI turnstile.h 1.27 94/10/27 SMI param.h 1.34 95/11/05 SMI pirec.h 1.11 93/12/20 SMI mutex.h 1.14 94/07/29 SMI rwlock.h 1.3 94/07/29 SMI semaphore.h 1.4 94/07/29 SMI condvar.h 1.6 94/07/29 
SMI cred.h 1.18 94/12/04 SMI uio.h 1.21 94/04/22 SMI seg_enum.h 1.1 93/04/03 SMI poll.h 1.19 94/08/31 SMI strmdep.h 1.8 92/07/14 SMI byteorder.h 1.9 94/01/04 SMI socket.h 1.15 95/02/24 SMI netconfig.h 1.13 95/02/24 SMI sockio.h 1.11 93/10/26 SMI ioccom.h 1.10 92/07/14 SMI netdb.h 1.15 96/05/29 SMI types.h 1.38 95/11/14 SMI feature_tests.h 1.7 94/12/06 SMI isa_defs.h 1.7 94/10/26 SMI machtypes.h 1.9 94/11/05 SMI select.h 1.10 92/07/14 SMI time.h 2.47 95/08/24 SMI time.h 1.23 95/08/28 SMI siginfo.h 1.36 95/08/24 SMI machsig.h 1.10 94/11/05 SMI unistd.h 1.33 95/08/28 SMI unistd.h 1.24 95/08/24 SMI socket.h 1.15 95/02/24 SMI netconfig.h 1.13 95/02/24 SMI sockio.h 1.11 93/10/26 SMI ioccom.h 1.10 92/07/14 SMI ipc.h 1.15 94/09/03 SMI un.h 1.8 92/07/14 SMI netdb.h 1.15 96/05/29 SMI in.h 1.4 93/07/06 SMI stream.h 1.56 94/09/28 SMI vnode.h 1.54 98/10/09 SMI t_lock.h 1.42 94/11/02 SMI machlock.h 1.14 94/10/20 SMI dki_lkinfo.h 1.8 93/05/03 SMI dl.h 1.13 93/08/18 SMI sleepq.h 1.17 94/07/29 SMI turnstile.h 1.27 94/10/27 SMI param.h 1.34 95/11/05 SMI pirec.h 1.11 93/12/20 SMI mutex.h 1.14 94/07/29 SMI rwlock.h 1.3 94/07/29 SMI semaphore.h 1.4 94/07/29 SMI condvar.h 1.6 94/07/29 SMI cred.h 1.18 94/12/04 SMI uio.h 1.21 94/04/22 SMI seg_enum.h 1.1 93/04/03 SMI poll.h 1.19 94/08/31 SMI strmdep.h 1.8 92/07/14 SMI byteorder.h 1.9 94/01/04 SMI inet.h 1.8 92/07/14 SMI in_systm.h 1.4 93/02/04 SMI ip.h 1.4 93/08/18 SMI ip_icmp.h 1.2 93/02/04 SMI /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/install/opcrclchk: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/install/opcrdschk: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/install/opcrndchk: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/install/opcroschk: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/install/opcrverchk: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/install/opcrinst: HP OpenView VantagePoint for Sun Solaris A.06.15.1 ( 08/27/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/ana_disk.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/cpu_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/disk_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/dist_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/ last_logs.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/mailq_l.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/ mondbfile.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/opcfwtmp.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/proc_mon.sh.Z: None 
/var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/sh_procs.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/ssp_chk.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/swap_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/ vcs_monitor.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/monitor/vp_chk.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_DCE_TCP/opc_pkg.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/actions/dist_del.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/actions/mailq_pr.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/actions/st_inetd.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/actions/st_mail.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/actions/st_syslogd.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/cmds/E10000Log.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/cmds/opcdf.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/cmds/opclpst.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/cmds/opcps.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/cmds/ssp_config.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/install/opcnsl: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) stdio.h 1.39 95/12/04 SMI feature_tests.h 1.7 94/12/06 SMI va_list.h 1.6 96/01/26 SMI types.h 1.38 95/11/14 SMI isa_defs.h 1.7 94/10/26 SMI machtypes.h 1.9 94/11/05 SMI select.h 1.10 92/07/14 SMI time.h 2.47 95/08/24 SMI time.h 1.23 95/08/28 SMI siginfo.h 1.36 95/08/24 SMI machsig.h 1.10 94/11/05 SMI socket.h 1.15 95/02/24 SMI netconfig.h 1.13 95/02/24 SMI sockio.h 1.11 93/10/26 SMI ioccom.h 1.10 92/07/14 SMI in.h 1.4 93/07/06 SMI stream.h 1.56 94/09/28 SMI vnode.h 1.54 98/10/09 SMI t_lock.h 1.42 94/11/02 SMI machlock.h 1.14 94/10/20 SMI dki_lkinfo.h 1.8 93/05/03 SMI dl.h 1.13 93/08/18 SMI sleepq.h 1.17 94/07/29 SMI turnstile.h 1.27 94/10/27 SMI param.h 1.34 95/11/05 SMI unistd.h 1.24 95/08/24 SMI pirec.h 1.11 93/12/20 SMI mutex.h 1.14 94/07/29 SMI rwlock.h 1.3 94/07/29 SMI semaphore.h 1.4 94/07/29 SMI condvar.h 1.6 94/07/29 SMI cred.h 1.18 94/12/04 SMI uio.h 1.21 94/04/22 SMI seg_enum.h 1.1 93/04/03 SMI poll.h 1.19 94/08/31 SMI strmdep.h 1.8 92/07/14 SMI byteorder.h 1.9 94/01/04 SMI netdb.h 1.17 97/12/23 SMI types.h 1.38 95/11/14 SMI feature_tests.h 1.7 94/12/06 SMI isa_defs.h 1.7 94/10/26 SMI machtypes.h 1.9 94/11/05 SMI select.h 1.10 92/07/14 SMI time.h 2.47 95/08/24 SMI time.h 1.23 95/08/28 SMI siginfo.h 1.36 95/08/24 SMI machsig.h 1.10 94/11/05 SMI types.h 1.38 95/11/14 SMI feature_tests.h 1.7 94/12/06 SMI isa_defs.h 1.7 94/10/26 SMI machtypes.h 1.9 94/11/05 SMI select.h 1.10 92/07/14 SMI time.h 2.47 95/08/24 SMI time.h 1.23 95/08/28 SMI siginfo.h 1.36 95/08/24 SMI machsig.h 1.10 94/11/05 SMI unistd.h 1.33 95/08/28 SMI unistd.h 1.24 95/08/24 SMI utsname.h 1.24 
95/07/14 SMI errno.h 1.13 95/09/10 SMI errno.h 1.15 95/01/22 SMI in.h 1.4 93/07/06 SMI stream.h 1.56 94/09/28 SMI vnode.h 1.54 98/10/09 SMI t_lock.h 1.42 94/11/02 SMI machlock.h 1.14 94/10/20 SMI dki_lkinfo.h 1.8 93/05/03 SMI dl.h 1.13 93/08/18 SMI sleepq.h 1.17 94/07/29 SMI turnstile.h 1.27 94/10/27 SMI param.h 1.34 95/11/05 SMI pirec.h 1.11 93/12/20 SMI mutex.h 1.14 94/07/29 SMI rwlock.h 1.3 94/07/29 SMI semaphore.h 1.4 94/07/29 SMI condvar.h 1.6 94/07/29 SMI cred.h 1.18 94/12/04 SMI uio.h 1.21 94/04/22 SMI seg_enum.h 1.1 93/04/03 SMI poll.h 1.19 94/08/31 SMI strmdep.h 1.8 92/07/14 SMI byteorder.h 1.9 94/01/04 SMI socket.h 1.15 95/02/24 SMI netconfig.h 1.13 95/02/24 SMI sockio.h 1.11 93/10/26 SMI ioccom.h 1.10 92/07/14 SMI netdb.h 1.15 96/05/29 SMI types.h 1.38 95/11/14 SMI feature_tests.h 1.7 94/12/06 SMI isa_defs.h 1.7 94/10/26 SMI machtypes.h 1.9 94/11/05 SMI select.h 1.10 92/07/14 SMI time.h 2.47 95/08/24 SMI time.h 1.23 95/08/28 SMI siginfo.h 1.36 95/08/24 SMI machsig.h 1.10 94/11/05 SMI unistd.h 1.33 95/08/28 SMI unistd.h 1.24 95/08/24 SMI socket.h 1.15 95/02/24 SMI netconfig.h 1.13 95/02/24 SMI sockio.h 1.11 93/10/26 SMI ioccom.h 1.10 92/07/14 SMI ipc.h 1.15 94/09/03 SMI un.h 1.8 92/07/14 SMI netdb.h 1.15 96/05/29 SMI in.h 1.4 93/07/06 SMI stream.h 1.56 94/09/28 SMI vnode.h 1.54 98/10/09 SMI t_lock.h 1.42 94/11/02 SMI machlock.h 1.14 94/10/20 SMI dki_lkinfo.h 1.8 93/05/03 SMI dl.h 1.13 93/08/18 SMI sleepq.h 1.17 94/07/29 SMI turnstile.h 1.27 94/10/27 SMI param.h 1.34 95/11/05 SMI pirec.h 1.11 93/12/20 SMI mutex.h 1.14 94/07/29 SMI rwlock.h 1.3 94/07/29 SMI semaphore.h 1.4 94/07/29 SMI condvar.h 1.6 94/07/29 SMI cred.h 1.18 94/12/04 SMI uio.h 1.21 94/04/22 SMI seg_enum.h 1.1 93/04/03 SMI poll.h 1.19 94/08/31 SMI strmdep.h 1.8 92/07/14 SMI byteorder.h 1.9 94/01/04 SMI inet.h 1.8 92/07/14 SMI in_systm.h 1.4 93/02/04 SMI ip.h 1.4 93/08/18 SMI ip_icmp.h 1.2 93/02/04 SMI /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/install/opcrclchk: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/install/opcrdschk: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/install/opcrinst: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/install/opcrndchk: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/install/opcroschk: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/install/opcrverchk: HP OpenView VantagePoint for Sun Solaris A.06.15 (04 /28/03) /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/monitor/ana_disk.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/monitor/cpu_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/monitor/disk_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/monitor/dist_mon.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ solaris/A.06.15.1/RPC_NCS/monitor/last_logs.sh.Z: None /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/ 
solaris/A.06.15.1/RPC_NCS/monitor/mailq_l.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/mondbfile.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/opcfwtmp.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/proc_mon.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/sh_procs.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/ssp_chk.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/swap_mon.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/vcs_monitor.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/vp_chk.sh.Z: None
  /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/opc_pkg.Z: None
  /var/opt/OV/share/tmp/OpC_appl/new_syslog/C/TEMPLATES/LOGFILE/logfile.dat: None
  /var/opt/OV/share/tmp/OpC_appl/new_syslog/C/new_sysl.idx: None
  /var/opt/OV/share/tmp/OpC_appl/new_syslog/ja_JP.SJIS/TEMPLATES/LOGFILE/logfile.dat: None
  /var/opt/OV/share/tmp/OpC_appl/new_syslog/ja_JP.SJIS/new_sysl.idx: None
  /opt/OV/OpC/examples/progs/Makef.solaris: None
  /opt/OV/OpC/examples/progs/Makef.solarisdce: None

cksum(1) Output:
  OVOPC-CLT.OVOPC-SOL-CLT,fr=A.06.00,fa=HP-UX_B.11.00_32/64,v=HP:
  2177823152 4380 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/AgentPlatform
  1032807215 6512 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/dist_del.sh.Z
  3282119219 2567 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/mailq_pr.sh.Z
  3828763907 2663 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/st_inetd.sh.Z
  3555142267 2608 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/st_mail.sh.Z
  3363692505 2666 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/actions/st_syslogd.sh.Z
  2572702873 4231 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/E10000Log.sh.Z
  3555398820 14040 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/opc_sec_v.sh.Z
  1612411984 349 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/opcdf.Z
  4013463814 410 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/opclpst.Z
  3740945589 424 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/opcps.Z
  1456448797 4025 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/cmds/ssp_config.sh.Z
  2651261519 33304 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcnsl
  3932989469 28090 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcrclchk
  4282446215 29760 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcrdschk
  535482626 29793 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcrndchk
  2630383912 6462 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcroschk
  3538122812 28472 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcrverchk
  4058032708 132878 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcrinst
  2020593970 2776 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/ana_disk.sh.Z
  3780972350 6356 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/cpu_mon.sh.Z
  2643478561 6538 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/disk_mon.sh.Z
  2124188313 6497 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/dist_mon.sh.Z
  230639919 6239 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/last_logs.sh.Z
  2391399744 6250 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/mailq_l.sh.Z
  2189379747 14825 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/mondbfile.sh.Z
  2722553490 20658 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/opcfwtmp.Z
  116169570 6378 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/proc_mon.sh.Z
  2468395146 5871 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/sh_procs.sh.Z
  1215882578 6395 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/ssp_chk.sh.Z
  222352248 6135 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/swap_mon.sh.Z
  2827130520 6978 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/vcs_monitor.sh.Z
  3147664177 6096 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/monitor/vp_chk.sh.Z
  2729322584 6273265 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/opc_pkg.Z
  1032807215 6512 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/actions/dist_del.sh.Z
  3282119219 2567 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/actions/mailq_pr.sh.Z
  3828763907 2663 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/actions/st_inetd.sh.Z
  3555142267 2608 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/actions/st_mail.sh.Z
  3363692505 2666 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/actions/st_syslogd.sh.Z
  2572702873 4231 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/cmds/E10000Log.sh.Z
  1612411984 349 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/cmds/opcdf.Z
  4013463814 410 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/cmds/opclpst.Z
  3740945589 424 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/cmds/opcps.Z
  1456448797 4025 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/cmds/ssp_config.sh.Z
  2386894143 11164 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcnsl
  3932989469 28090 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcrclchk
  4282446215 29760 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcrdschk
  4154201093 133021 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcrinst
  535482626 29793 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcrndchk
  2630383912 6462 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcroschk
  3538122812 28472 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/install/opcrverchk
  2020593970 2776 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/ana_disk.sh.Z
  3780972350 6356 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/cpu_mon.sh.Z
  2643478561 6538 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/disk_mon.sh.Z
  2124188313 6497 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/dist_mon.sh.Z
  230639919 6239 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/last_logs.sh.Z
  2391399744 6250 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/mailq_l.sh.Z
  2189379747 14825 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/mondbfile.sh.Z
  2545704791 5520 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/opcfwtmp.Z
  116169570 6378 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/proc_mon.sh.Z
  2468395146 5871 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/sh_procs.sh.Z
  1215882578 6395 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/ssp_chk.sh.Z
  222352248 6135 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/swap_mon.sh.Z
  2827130520 6978 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/vcs_monitor.sh.Z
  3147664177 6096 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/monitor/vp_chk.sh.Z
  3420844123 4147855 /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_NCS/opc_pkg.Z
  2950102209 7107 /var/opt/OV/share/tmp/OpC_appl/new_syslog/C/TEMPLATES/LOGFILE/logfile.dat
  675288195 557 /var/opt/OV/share/tmp/OpC_appl/new_syslog/C/new_sysl.idx
  610799592 5231 /var/opt/OV/share/tmp/OpC_appl/new_syslog/ja_JP.SJIS/TEMPLATES/LOGFILE/logfile.dat
  1991845724 553 /var/opt/OV/share/tmp/OpC_appl/new_syslog/ja_JP.SJIS/new_sysl.idx
  3316514381 1058 /opt/OV/OpC/examples/progs/Makef.solaris
  3226737344 1146 /opt/OV/OpC/examples/progs/Makef.solarisdce

Patch Conflicts: None

Patch Dependencies: None

Hardware Dependencies: None

Other Dependencies: None

Supersedes: PHSS_28863 PHSS_27298 PHSS_24641 PHSS_24126 PHSS_23825 PHSS_22886 PHSS_22256

Equivalent Patches: ITOSOL_00261:
  sparcSOL: 2.6 2.7 2.8

Patch Package Size: 10360 KBytes

Installation Instructions:
Please review all instructions and the Hewlett-Packard SupportLine User Guide or your Hewlett-Packard support terms and conditions for precautions, scope of license, restrictions, and limitation of liability and warranties, before installing this patch.
------------------------------------------------------------
1. Back up your system before installing a patch.

2. Login as root.

3. Copy the patch to the /tmp directory.

4. Move to the /tmp directory and unshar the patch:

     cd /tmp
     sh PHSS_29646

5. Run swinstall to install the patch:

     swinstall -x autoreboot=true -x patch_match_target=true \
               -s /tmp/PHSS_29646.depot

By default swinstall will archive the original software in /var/adm/sw/save/PHSS_29646. If you do not wish to retain a copy of the original software, include the patch_save_files option in the swinstall command above:

     -x patch_save_files=false

WARNING: If patch_save_files is false when a patch is installed, the patch cannot be deinstalled. Please be careful when using this feature.

For future reference, the contents of the PHSS_29646.text file are available in the product readme:

     swlist -l product -a readme -d @ /tmp/PHSS_29646.depot

To put this patch on a magnetic tape and install from the tape drive, use the command:

     dd if=/tmp/PHSS_29646.depot of=/dev/rmt/0m bs=2k

Special Installation Instructions:

NOTE: This patch contains an updated "Syslog (Solaris)" template. If you want to update the template in your DB, please enter:

     # opccfgupld -add -subentity new_syslog

This will add the new conditions to the "Syslog (Solaris)" template. In the Admin GUI you can now remove the template "Kernel Logs (Solaris)", as the conditions from the default template "Kernel Logs (Solaris)" have been added to the template "Syslog (Solaris)". If you have modified the default "Kernel Logs (Solaris)" template, you might want to compare the "Syslog" and the "Kernel Logs" templates and decide whether you still need the "Kernel Logs" template.

(A) Patch Installation Instructions
    -------------------------------
(A1) Install the patch, following the standard installation instructions provided above under "Installation Instructions". Note that you can use opc_backup(5) for backing up your system before installing a patch.

NOTE: Make sure that no agent of the platform addressed by this patch is distributed (either from the ITO Administrator's GUI or from the command line using inst.sh) while running swinstall.

If you are running VPO in an MC/ServiceGuard installation:
- Note that only files on the shared disk volume at /var/opt/OV/share will be patched. Therefore, install the patch on one cluster node while the shared disks are mounted. The server processes may be running during patch installation.
- It is not necessary to install this patch on all cluster nodes. Even though the software inventory on the other cluster nodes will not be updated, the patched files will be available there when the shared disk is switched to them.

NOTE: This patch must be installed on the VPO Management Server system, NOT directly on a VPO Managed Node. Changes will take effect on managed nodes by means of VPO Software Distribution (using 'Force Update' if there is already an agent installed on the managed node); an example is sketched below. See chapter 2 of the VPO Administrator's Reference manual for more information.
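For example, the updated agent software can be redistributed from the management server command line with the inst.sh script referenced above (the path shown is an assumption based on a standard management server layout; the script prompts interactively for the nodes to update):

     # (re)install the agent software on managed nodes from the
     # management server; equivalent to software distribution
     # with 'Force Update' from the GUI
     /opt/OV/bin/OpC/agtinstall/inst.sh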
(B) Patch Deinstallation Instructions
    ---------------------------------
NOTE: Make sure that no agent of the platform addressed by this patch is distributed (either from the ITO Administrator's GUI or from the command line using inst.sh) while running swremove.

If you are running VPO in an MC/ServiceGuard installation, make sure to mount the shared disks at the node, and only at the node, that had them mounted during patch installation. Otherwise, restoration of the original files onto the shared disk will fail.
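After installation, one way to confirm that the patch is present on the management server (standard SD-UX commands; the revision string reported by what(1) is the one listed in the "what(1) Output" section above):

     # list the installed patch product
     swlist -l product PHSS_29646
     # check the embedded revision string of a patched file
     what /var/opt/OV/share/databases/OpC/mgd_node/vendor/sun/sparc/solaris/A.06.15.1/RPC_DCE_TCP/install/opcrinst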