I had a scenario where I needed to shut down my Windows machine after a particular period of time. I used the Windows shutdown command for this and it worked well: -s shuts the machine down, -f forces running applications to close, and -t gives the delay in seconds, so the command below schedules a shutdown an hour from now.
shutdown -s -f -t 3600
To cancel (un-schedule) the pending shutdown:
shutdown -a
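Since -t takes seconds, I find it easier to compute longer delays first; a minimal sketch for a batch file, assuming a two-hour delay is wanted:
set /a delay=2*3600
shutdown -s -f -t %delay%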
Wednesday, 29 December 2010
How to set svn:externals
svn:externals is a handy feature Subversion provides to point to another repository's content from your own repository; the external content is fetched whenever your working copy is checked out or updated. This way you don't keep the same content in multiple repositories, wasting disk space, and maintenance is easier: instead of applying a change to every repository, you change it in one place and every repository that references it through svn:externals picks it up. More details are at http://svnbook.red-bean.com/en/1.0/ch07s03.html, and below are my svn:externals settings.
cat externals.txt
madhu http://localhost/svn/madhu/test
---------------------------------------------------------------------------------------------------------------------------------------------------
svn propset svn:externals -F externals.txt .
property 'svn:externals' set on '.'
svn di
Property changes on: .
___________________________________________________________________
Added: svn:externals
+ madhu http://localhost/svn/madhu/test
svn ci -m "set externals"
Sending .
Committed revision 8.
svn up
Fetching external item into 'madhu'
A madhu\New Bitmap Image.bmp
Updated external to revision 9.
Updated to revision 8.
C:\repo>svn info
URL: http://localhost/svn/repo
C:\repo\madhu>svn info
URL: http://localhost/svn/madhu/test
svn propget svn:externals
madhu http://localhost/svn/madhu/test
svn propdel svn:externals
property 'svn:externals' deleted from '.'
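If you don't want the external to track the latest revision, the externals definition can also pin it to a specific revision; a small sketch using the same pre-1.5 format as above, assuming revision 9 is the one to freeze:
cat externals.txt
madhu -r 9 http://localhost/svn/madhu/test
svn propset svn:externals -F externals.txt .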
Thursday, 16 December 2010
Hudson Jobs in the subversion repository
Every morning, the first thing I look at after entering the office is Hudson. That doesn't mean I am an expert in Hudson; I am still in the learning phase. We decided to bring all the Hudson content into Subversion, the version control tool we use to maintain our source code. Once everything is in Subversion we no longer need to worry about losing data, and maintenance becomes easier. As our company follows the Agile methodology, continuous integration plays an important role, and that's where Hudson comes into the picture.
As the first step, we moved the jobs directory into Subversion. The task was given to me, and a repository had already been created for committing the Hudson content. I decided to use `svn import`, since it doesn't need a checked-out working copy and it creates intermediate directories if they don't exist in the repository. However, import doesn't accept multiple arguments, so I couldn't pass all the jobs in a single command, and I couldn't import the whole jobs directory in one shot either, because it was more than 20 GB and we weren't sure whether Subversion could handle that much data at once. So I wrote a shell script that imports the jobs one by one in a loop. Below is the script.
cat svn-import.sh
./svn-import.sh password
#!/bin/bash
import_dir=(5.4 5.4.1 6.1 6.0.2)
for i in "${import_dir[@]}"
do
    echo "processing $i"
    for a in "$i"/*
    do
        # the job name is the last path component
        b=$(basename "$a")
        svn import "$a" "https://forge.collab.net/svn/repos/hudson/relenghub/jobs/$a" \
            -m "[artf78984] Adding the $b hudson jobs to subversion for easy maintenance and recovery" \
            --username=madhu --password="$1" --no-auth-cache >> /tmp/import.log 2>&1
    done
done
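After the script finishes, listing a batch back from the repository is a quick way to confirm the import landed; for example, for the 5.4 jobs imported above:
svn list https://forge.collab.net/svn/repos/hudson/relenghub/jobs/5.4 --username=madhu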
I used the same looping approach to copy files from one directory to another:
cat copy.sh
./copy.sh
#!/bin/bash
import_dir=(5.4.1 6.1 6.0.2)
for i in "${import_dir[@]}"
do
    echo "processing $i"
    for a in "$i"/*
    do
        echo "$a"
        # copy only the 11-13 December 2010 runs plus the job configuration
        mkdir -p "/tmp/jobs/$a/builds"
        /bin/cp -v -r "$a"/2010-12-1[123]* "/tmp/jobs/$a/"
        /bin/cp -v -r "$a"/builds/2010-12-1[123]* "/tmp/jobs/$a/builds/"
        /bin/cp -v "$a"/config.xml "/tmp/jobs/$a/config.xml"
    done
done
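Since only selected dates are copied, spot-checking one copied item against the source is a quick sanity check; a sketch where some-job is a placeholder for a real job directory:
diff -q 5.4.1/some-job/config.xml /tmp/jobs/5.4.1/some-job/config.xml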
Thursday, 4 November 2010
Hudson - Continuous Integration Tool
I have used Hudson for the past two years, so I thought I would write about its features. It is a great tool, and the first thing I like about Hudson is its user-friendly web interface. Hudson has a lot of features, but I only know a few of them from a user's perspective, and that's what I will discuss here.
Before going into the details of Hudson, let me describe my understanding of continuous integration: it is a process where an SCM repository (Subversion, Git, CVS, etc.) is polled on a schedule for changes. If there is a change, the latest source code is checked out and a build is run; if the build fails during compilation, the committers are notified about the failure. Once the build completes, tests are triggered to check the stability of the build. This gives a quick way to check build stability and fix problems early, preventing delays in product deliveries.
I have installed Hudson on Red Hat Linux and on Windows; below is how I start it.
java -jar hudson.war --httpPort=8080 #this makes Hudson run on port 8080
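On Linux I keep Hudson running after the shell exits by starting it in the background; a minimal sketch, assuming the log should go to hudson.log in the current directory:
nohup java -jar hudson.war --httpPort=8080 > hudson.log 2>&1 &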
Once Hudson is up, all further setup is done via the web UI itself, and Hudson has tons of plugins: SCM plugins, test reporting, IRC notifier, email, access control, build wrappers, and more.
Manage Hudson gives a central configuration management page for the Hudson server, where you can configure access control, project-based authorization, IRC notification, and email notification. In my setup I run Hudson on CentOS with 4 GB of memory and a 250 GB hard disk.
I used Hudson in master/slave mode, meaning I run the Hudson server on one machine (the master), configure individual machines as slaves, and run each job on a slave. Slave configuration is done on the Manage Nodes page. On Windows, the slave is launched via JNLP (Java Network Launching Protocol); on Linux it is launched over SSH (Secure Shell). Each slave machine needs a directory dedicated to Hudson, and you can run multiple jobs on one slave.
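For a Windows slave, the node page offers a slave.jar that connects back to the master over JNLP; a minimal sketch, where http://hudson-master:8080 and win-slave1 are placeholders for the real master URL and node name:
java -jar slave.jar -jnlpUrl http://hudson-master:8080/computer/win-slave1/slave-agent.jnlp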
Hudson directory structure on a slave machine: ls /u1/hudson/workspace
madhu-job1 madhu-job2 madhu-job3 #three jobs run on this slave
Create a job on the New Job page; I usually create a "Build a free-style software project" job. After the job is created, configure it by allocating a slave to it. Specify the repository URL in the Source Code Management section, and use Poll SCM to say how frequently Hudson should poll the repository.
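The Poll SCM field takes a cron-style schedule; for example, the entry below makes Hudson poll the repository every 15 minutes:
*/15 * * * *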
Next, add a Build Step to specify how to build the checked-out source code. There are various options: Execute shell, Ant, Maven, Python script, Groovy script, Windows batch command. I use Execute shell to run the build script, as most of my builds are Linux based.
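An Execute shell step is just a small script run inside the job's workspace; a minimal sketch, where build.sh is a placeholder for the project's real build script:
#!/bin/bash
cd "$WORKSPACE"    # Hudson exports the job workspace path as WORKSPACE
./build.sh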
Tuesday, 28 September 2010
Vi Editor
I was a Windows user and had never tried Linux before this job. But my work involves Linux, and most of the time I need to work at the Linux shell prompt, not in a GUI. Like every new Linux shell user I started with the vi editor, and below are the commands I learned that I felt were worth sharing.
vi madhu - Opens a file
ESC + :wq - Saves the file and exits
ESC + :q - Exits the file without saving the changes
ESC + :w! - Saves the changes (forcing the write) without exiting the file
ESC + :$ - Moves the cursor to the last line of the file
ESC + :1 - Moves the cursor to the first line of the file
ESC + :set nonu - Hides the line numbers in the file
ESC + :set nu - Displays the line numbers in the file
ESC + /madhu - Searches for the string 'madhu' in the file
ESC + :3 - Goes to line 3
ESC + :w /tmp/newfile - Writes the contents of the current file to '/tmp/newfile'
ESC + :1,10w /tmp/newfile - Writes the contents of lines 1 to 10 to '/tmp/newfile'
ESC + :w! /tmp/newfile - Overwrites the file '/tmp/newfile' with this file's content
Remember, there are other text editors on Linux, like Emacs and Vim. I preferred vi because I felt it was easy to use.
Saturday, 25 September 2010
PBL Delete based on date
I wrote a Python script to delete files from the PBL (Public Build Library). A large number of files accumulate in the PBL, since the nightly builds of every release are uploaded there. To avoid running out of space we check with the testing team and delete the older builds they no longer use, for example builds older than two days. But the deletion process had to be done manually: find the files older than two days and delete them using the Cubit API client. So I decided to write a simple script that takes user input such as the file name, how long to retain files, and the command-line arguments for the Cubit API client. Below is the script and its usage.
#!/usr/bin/env python
from os.path import exists, join
from os import pathsep
import urllib2
import datetime
import re
import subprocess
import os, sys
from optparse import OptionParser

parser = OptionParser()
parser.add_option("-u", "--user", dest="user", help="Enter the username of the PBL user")
parser.add_option("-k", "--key", dest="key", help="Enter the API key of the PBL user")
parser.add_option("-p", "--project", dest="project", help="Enter the project name of the file to be deleted")
parser.add_option("-l", "--api_url", dest="api_url", help="Enter the API url")
parser.add_option("-a", "--address", dest="address", help="Enter the exact site address where the file resides eg:https://mgr.cubit.sp.collab.net/pbl/releng/pub/test/")
parser.add_option("-f", "--filename", dest="filename", help="Enter the name of the file to be deleted")
parser.add_option("-t", "--project_type", dest="project_type", help="Enter the project type")
parser.add_option("--temp_file", dest="temp_file", help="Enter the name of the temporary file to write the webpage content for parsing")
parser.add_option("-d", "--delete_files_older_than", type="int", dest="delete_files_older_than", help="Enter the number of days, i.e. files older than this number of days will be deleted")
parser.add_option("-r", "--remote_path", dest="remote_path", help="Enter the remote path of the file eg:/test")
(options, args) = parser.parse_args()

if not (options.user and options.key and options.project and options.api_url
        and options.address and options.filename and options.project_type
        and options.temp_file and options.delete_files_older_than
        and options.remote_path):
    sys.exit(parser.print_help())

# Pick the PBL executable name for the current platform.
if os.name == "nt":
    pbl_exec = "pbl.exe"
else:
    pbl_exec = "pbl.py"

# Make sure the Cubit API client is available on the PATH.
file_found = False
for path in os.getenv('PATH').split(pathsep):
    if exists(join(path, pbl_exec)):
        file_found = True
        break
if not file_found:
    sys.exit("Unable to find the PBL executable in the PATH, please install the Cubit API Client or add it to the PATH")

# Truncate the current date to day precision so the age comparison is in whole days.
current_date = datetime.datetime.now().date().strftime("%Y-%b-%d")
current_date = datetime.datetime.strptime(current_date, '%Y-%b-%d')

# Fetch the PBL index, which lists every file along with its upload date.
address = options.address + "/.cubit_pbl_index.txt"
website = urllib2.urlopen(address)
f = open(options.temp_file, 'w')
f.write(website.read())
f.close()

a = open(options.temp_file, 'r')
for line in a:
    if re.search(options.filename, line):
        file_date = line.split(',')[1].split()[0]
        file_date = datetime.datetime.strptime(file_date, '%Y-%b-%d')
        diff = (current_date - file_date).days
        if diff >= options.delete_files_older_than:
            file_to_delete = line.split(',')[0]
            cmd = 'pbl delete -l %s -u %s -k %s -t %s -p %s -r %s/%s' % (
                options.api_url, options.user, options.key, options.project_type,
                options.project, options.remote_path, file_to_delete)
            subprocess.call(cmd.split())
a.close()

if os.path.isfile(options.temp_file):
    os.unlink(options.temp_file)
--------------------------------------------------------------------------------------------
C:\>python pbl_delete_win.py
User:madhu
Key:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Project:test
Project-type:pub
api_url:https://
Address:https://
filename:test
retain_files_older_than:1
Remote_path:test
pbl: Deleting test/test-1.txt
pbl: OK
pbl: Deleting test/test-2.txt
pbl: OK
pbl: Deleting test/test-3.txt
pbl: OK
pbl: Deleting test/test-4.txt
pbl: OK
This script still needs refinement: the current implementation works properly only if the client and the server are in the same time zone, otherwise the wrong files might get deleted; and the wrong files might also get deleted if the specified file name appears as a substring of another file name. If you look at the output above, even though I passed "test" as the file name, "test-1.txt" was also deleted; this needs to be validated.
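One possible fix for the substring problem is to anchor the match instead of using a bare re.search; a minimal sketch, assuming every line of .cubit_pbl_index.txt starts with the file name followed by a comma (as parsed above):
import re

def line_matches(filename, line):
    # Match only when the entry name equals the requested file name,
    # so "test" no longer matches "test-1.txt".
    return re.match(r'^%s\s*,' % re.escape(filename), line) is not None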
Friday, 24 September 2010
LogBot
Oh yeah, today I tried setting up a bot for the IRC channel I use, to log the chats in the channel. I ended up using http://www.jibble.org/logbot/ and it worked fine on my Windows machine with the default configuration tokens, but when I tried it on a Linux box, which sits behind different access control than my Windows box, it failed with a "password incorrect" error. I saw there was no password token in the default config file, so I did some basic debugging, adding print statements ('System.out.println') in the code to understand its flow, and found that the password was being passed as null. The IRC server I was connecting to requires a password, so I added a token called password='password', passed it to the connect method, and it worked fine. Even on Windows the password is passed as null, yet that worked; maybe there is a server-side configuration that says a password is not required when the request comes from a particular network.
After this I had another problem: the log file was displayed as raw HTML. So I installed PHP on the Linux machine and configured Apache to load the PHP module, because the LogBot application places the PHP files that process and format the log file in the log directory.
LoadModule php5_module modules/libphp5.so
Apache also needs to be told to process .php files before sending them to the client.
AddType application/x-httpd-php .php
Now the bot is running fine and logging the chats date-wise.
But I want the logs to be viewable only by my team, so I used Apache to do this work for me with its authentication directives, configured for LDAP.
<Directory "/var/www/html">
    AuthType Basic
    AuthBasicProvider ldap
    AuthName "Enter password to access the logs."
    AuthzLDAPAuthoritative on
    AuthLDAPURL "ldap://xxx.xxx.xxx.xxx:xxx/DC=xxx,DC=xxx,DC=xxx,DC=xxx?sAMAccountName?sub"
    AuthLDAPBindDN cn=ldapauth1,ou=MiscAccounts,dc=xxx,dc=xxx,dc=xxx,dc=xxx
    AuthLDAPBindPassword password
    AuthGroupFile /etc/httpd/conf/groups
    Require group madhu_grp
</Directory>
cat /etc/httpd/conf/groups
madhu_grp: madhu mike michael
Only the members of madhu_grp can access the logs, by providing their valid LDAP credentials.
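A quick way to confirm the restriction works is to request the log directory with and without credentials; a minimal check, assuming the logs are served under http://localhost/logs/ (placeholder path):
curl -I http://localhost/logs/             # expect 401 Authorization Required
curl -I -u madhu http://localhost/logs/    # prompts for the LDAP password, then expect 200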
Thursday, 23 September 2010
Hudson-QTP Integration
Let's see what I learned today. I tried setting up Hudson on my machine and integrated QTP with it to run the tests and publish the report in Hudson. I used the VBScript below to trigger QTP from Hudson. I didn't write the script from scratch; I got the pieces through Google and modified them to fit my needs. The script ran fine and generated the result in HTML format, so I used the HTML Publisher plugin in Hudson to view the reports.
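Hudson kicks the script off through a "Windows batch command" build step; a minimal sketch, assuming the script below is saved as run_qtp.vbs in the job's workspace (the file name is a placeholder):
cscript //nologo run_qtp.vbs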
Dim qtApp 'As QuickTest.Application ' Declare the Application object variable
Dim qtTest 'As QuickTest.Test ' Declare a Test object variable
Set qtApp = CreateObject("QuickTest.Application") ' Create the Application object
qtApp.Launch ' Start QuickTest
qtApp.Visible = False ' Keep the QuickTest application hidden while the test runs
qtApp.Options.Run.ImageCaptureForTestResults = "OnError"
qtApp.Options.Run.RunMode = "Fast"
qtApp.Options.Run.ViewResults = True
qtApp.Open "C:\hudson\workspace\QTP-Hudson\Test1", True ' Open the test in read-only mode
' set run settings for the test
Set qtTest = qtApp.Test
qtTest.Settings.Run.OnError = "NextStep" ' Instruct QuickTest to proceed to the next step when an error occurs
qtTest.Settings.Run.ObjectSyncTimeOut = 30000 'Instruct QuickTest to wait for 30 seconds for response
Set qtResultsOpt = CreateObject("QuickTest.RunResultsOptions") ' Create the Run Results Options object
'qtResultsOpt.ResultsLocation = "C:\hudson\workspace\QTP-Hudson\General_results" ' Set the results location
qtTest.Run qtResultsOpt ' Run the test
' Save the Run-time Data
qtApp.Test.LastRunResults.DataTable.Export "C:\hudson\workspace\QTP-Hudson\Runtime.xls" ' Save the run-time Data Table to a file
WScript.StdOut.Write "Status is:" & qtTest.LastRunResults.Status & vbCr & vbLf ' Check the results of the test run
WScript.StdOut.Write "Error is:" & qtTest.LastRunResults.LastError & vbCr & vbLf ' Check the most recent error
WScript.StdOut.Write "Result path is:" & qtTest.LastRunResults.path & vbCr & vbLf
sRespath = qtTest.LastRunResults.path
sResultsXML = "" & sRespath & "\Report\Results.xml"
sDetailedXSL = "D:\Program Files\HP\QuickTest Professional\dat\PDetails.xsl"
sShortXSL = "D:\Program Files\HP\QuickTest Professional\dat\PShort.xsl"
ApplyXSL sResultsXML, sDetailedXSL, "C:\hudson\workspace\QTP-Hudson\Results_Detailed.html"
ApplyXSL sResultsXML, sShortXSL, "C:\hudson\workspace\QTP-Hudson\Results_Short.html"
Public Function ApplyXSL(ByVal inputXML, ByVal inputXSL, ByVal outputFile)
sXMLLib = "MSXML.DOMDocument"
Set xmlDoc = CreateObject(sXMLLib)
Set xslDoc = CreateObject(sXMLLib)
xmlDoc.async = False
xslDoc.async = False
xslDoc.load inputXSL
xmlDoc.load inputXML
outputText = xmlDoc.transformNode(xslDoc.documentElement)
Set FSO = CreateObject("Scripting.FileSystemObject")
Set outFile = FSO.CreateTextFile(outputFile,True)
outFile.Write outputText
outFile.Close
Set outFile = Nothing
Set FSO = Nothing
Set xmlDoc = Nothing
Set xslDoc = Nothing
Set xmlResults = Nothing
End Function
qtTest.Close ' Close the test
qtApp.Quit ' Exit QuickTest
Set qtResultsOpt = Nothing ' Release the Run Results Options object
Set qtTest = Nothing ' Release the Test object
Set qtApp = Nothing ' Release the Application object