freiheitpt
New Member
Hello all,
Well, as an assignment, we have to measure how long it takes to insert 100,000 rows of data into a database. The primary key indexes (4 separate tests) are the following:
Code:
-- Test 1: clustered, fill factor 100
ALTER TABLE company.employee
ADD CONSTRAINT pk PRIMARY KEY CLUSTERED (ssn) WITH (FILLFACTOR=100, PAD_INDEX=ON);
GO
-- Test 2: clustered, fill factor 80
ALTER TABLE company.employee
ADD CONSTRAINT pk PRIMARY KEY CLUSTERED (ssn) WITH (FILLFACTOR=80, PAD_INDEX=ON);
GO
-- Test 3: nonclustered, fill factor 100
ALTER TABLE company.employee
ADD CONSTRAINT pk PRIMARY KEY NONCLUSTERED (ssn) WITH (FILLFACTOR=100, PAD_INDEX=ON);
GO
-- Test 4: nonclustered, fill factor 80
ALTER TABLE company.employee
ADD CONSTRAINT pk PRIMARY KEY NONCLUSTERED (ssn) WITH (FILLFACTOR=80, PAD_INDEX=ON);
GO
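Aside from the raw timings, it may help to compare what the four configurations actually do to the index pages after each run. A sketch using SQL Server's `sys.dm_db_index_physical_stats` DMV (the database name `company` and table `company.employee` are the ones from the assignment; run this after each test before dropping the constraint):

```sql
-- Inspect how full and fragmented the pk index pages ended up.
-- Page counts and page-split effects should differ between
-- FILLFACTOR 100 and 80, and between clustered and nonclustered.
SELECT index_level,
       page_count,
       avg_page_space_used_in_percent,
       avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(
         DB_ID(),                        -- current database
         OBJECT_ID('company.employee'),  -- the test table
         NULL, NULL,
         'DETAILED');                    -- per-level statistics
```

If the four runs produce near-identical page counts and fragmentation, that would suggest the index settings are not taking effect as expected.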
the table is:
Code:
CREATE TABLE company.employee (
    Fname VARCHAR(15) NOT NULL,
    Minit CHAR,
    Lname VARCHAR(15) NOT NULL,
    Ssn INT NOT NULL /*PRIMARY KEY*/,
    Bdate DATE,
    Address VARCHAR(30),
    Sex CHAR,
    Salary DECIMAL(10,2),
    Super_ssn CHAR(9),
    Dno INT /*NOT NULL*/
);
GO
and finally, this is the algorithm that inserts the 100,000 data rows:
Code:
DECLARE @start_time DATETIME, @end_time DATETIME;
SET @start_time = GETDATE();
DECLARE @temp int;
SET @temp = 0;
WHILE (@temp < 100000)
BEGIN
    DBCC DROPCLEANBUFFERS; -- requires sysadmin rights on the DBMS
    BEGIN TRANSACTION Insert1;
    DECLARE @ssn int;
    -- note: RAND() can repeat, so a duplicate ssn raises a PK violation
    SELECT @ssn = ROUND(RAND() * 987654321, 0);
    INSERT INTO company.employee VALUES ('Maria', 'G', 'Sousa', @ssn, '2001-01-01', 'Rua XPTO', 'M', 1200, NULL, NULL);
    SET @temp += 1;
    COMMIT TRANSACTION Insert1;
END;
SET @end_time = GETDATE();
PRINT 'Milliseconds used: ' + CONVERT(VARCHAR(20), DATEDIFF(MILLISECOND, @start_time, @end_time)) + 'ms';
GO
GO
-- test
SELECT * FROM company.employee;
GO
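As a sanity check on the measurement itself, one variant worth trying is flushing the buffer cache once per test instead of once per row, so the per-row cost reflects the index work rather than 100,000 cache flushes. A sketch with the same table and insert, changing only the placement of `DBCC DROPCLEANBUFFERS`:

```sql
-- Variant: buffers flushed once before the loop rather than per insert.
DBCC DROPCLEANBUFFERS; -- requires sysadmin rights on the DBMS
DECLARE @start_time DATETIME = GETDATE();
DECLARE @temp int = 0;
WHILE (@temp < 100000)
BEGIN
    DECLARE @ssn int = ROUND(RAND() * 987654321, 0);
    INSERT INTO company.employee VALUES ('Maria', 'G', 'Sousa', @ssn, '2001-01-01', 'Rua XPTO', 'M', 1200, NULL, NULL);
    SET @temp += 1;
END;
PRINT 'Milliseconds used: ' + CONVERT(VARCHAR(20), DATEDIFF(MILLISECOND, @start_time, GETDATE())) + 'ms';
GO
```

If the four tests diverge under this variant but not under the original, the per-iteration flush was dominating the timing.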
Now, the problem:
On my server (on my laptop), the times are all roughly equivalent (around 230,000 ms each).
But the times given by the professor show that they should all be different, the first test taking the most time and the last one taking the least.
What could I be doing wrong?
(Note: even the professor doesn't know what is happening. The code is also more or less his. The idea here is to "learn" about the differences.)
Thanks!