I have the following dataset df
   UniqueID  Col1  Col2
0      1234     5   NaN
1      1235     3     4
2      1233   NaN     3
3      1111     3   NaN
I'd like to know the number of rows where Col1 is not null and Col2 is null.
I'd obviously go with PiRSquared's answer.
If, however, you want to do something fancy with query, then use
In [430]: df.query('Col1 == Col1 & Col2 != Col2').shape[0]
Out[430]: 2
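A minimal sketch of why this works, reconstructing the example frame from the question (column values assumed from the table above): NaN is the only value that is not equal to itself, so `Col1 == Col1` keeps rows where Col1 is not null and `Col2 != Col2` keeps rows where Col2 is null.

```python
import pandas as pd
import numpy as np

# Reconstruct the example frame from the question
df = pd.DataFrame({
    'UniqueID': [1234, 1235, 1233, 1111],
    'Col1': [5, 3, np.nan, 3],
    'Col2': [np.nan, 4, 3, np.nan],
})

# NaN != NaN, so these self-comparisons act as null checks inside query
count = df.query('Col1 == Col1 & Col2 != Col2').shape[0]
print(count)  # 2
```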
By using dropna
In [451]: df.dropna(axis=0, how='any', subset=['Col1']).Col2.isnull().sum()
Out[451]: 2
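As a runnable sketch (example frame reconstructed from the question): dropping the rows where Col1 is null first, then counting the nulls left in Col2, gives the same answer.

```python
import pandas as pd
import numpy as np

# Reconstruct the example frame from the question
df = pd.DataFrame({
    'UniqueID': [1234, 1235, 1233, 1111],
    'Col1': [5, 3, np.nan, 3],
    'Col2': [np.nan, 4, 3, np.nan],
})

# Drop rows where Col1 is null, then count the remaining nulls in Col2
count = df.dropna(axis=0, how='any', subset=['Col1']).Col2.isnull().sum()
print(count)  # 2
```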
First off, in your example the string 'nAn' is not a null. So, let's replace that string with np.nan:
import numpy as np
df = df.replace('nAn', np.nan)
df.Col1.notnull().sum()
or
df.Col1.count() # Note: `count` ignores nulls where `size`, `shape`, and `len` do not.
3
And use isnull to check for nulls explicitly:
df.Col2.isnull().sum()
2
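Putting this answer together as a sketch (it assumes the asker's raw data contains the literal string 'nAn', as described above). Note that counting the two columns separately gives 3 and 2; to count rows where Col1 is not null *and* Col2 is null in one step, you can combine the boolean masks:

```python
import pandas as pd
import numpy as np

# Frame with the literal string 'nAn' standing in for missing values
# (an assumption about the asker's raw data)
df = pd.DataFrame({
    'UniqueID': [1234, 1235, 1233, 1111],
    'Col1': [5, 3, 'nAn', 3],
    'Col2': ['nAn', 4, 3, 'nAn'],
})

df = df.replace('nAn', np.nan)  # turn the placeholder string into real NaN

print(df.Col1.notnull().sum())  # 3
print(df.Col1.count())          # 3 -- count ignores nulls
print(df.Col2.isnull().sum())   # 2

# Combined: rows where Col1 is not null AND Col2 is null
print((df.Col1.notnull() & df.Col2.isnull()).sum())  # 2
```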