Unimatrix_001 asked:
Using vectors and references - Cannot convert from vector to &
Hi.
void setBit(BIT &aBit, const bool aValue){
    if(aValue)
        aBit = 1;
    else
        aBit = 0;
}

int main(){
    vector<BIT> bitVec[3];
    setBit(bitVec[0], false);
    setBit(bitVec[1], false);
    setBit(bitVec[2], false);
    return false;
}
Why do I get errors about not being able to convert from a std::vector to 'BIT &'?
Actually, you'd better use:

vector<BIT> bitVec(3);

With square brackets, vector<BIT> bitVec[3]; declares an array of three vectors, so bitVec[0] is a whole vector<BIT>, not a BIT — hence the conversion error. With parentheses you construct a single vector already holding three BITs, which is what you need because immediately afterwards you access bitVec[0], bitVec[1] and bitVec[2].